TRANSLATING DATA PROTECTION INTO PRACTICE:
EXPLORING GAPS IN DATA PRIVACY AND CYBERSECURITY WITHIN
ORGANIZATIONS
by
Jillian K. Kwong
A Dissertation Presented to the
FACULTY OF THE USC GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
(COMMUNICATION)
August 2022
Copyright 2022 Jillian K. Kwong
Dedication
To Mom,
You never gave up on me and more importantly never let me give up on myself.
Thank you.
Acknowledgements
Years ago my mom told me something that has stuck with me ever since. “I’ve never
worried about you because you always manage to surround yourself with good people.” As I
reflect on this journey, I can’t help but be reminded of the wisdom in her words. This journey
wouldn’t be possible without a myriad of people who have helped me along the way. I like to say
Mrs. Hockwalt was my first “writing coach” starting in elementary school while Mr. Shirota was
my first “dissertation chair.” Unbeknownst to all of us, they began to lay the foundation with my
mom that would eventually lead me to USC.
CSULA
It is no secret that my transition to college was arduous at best. At 15, I struggled to find
myself and adjust to the academic demands put on those in my program to succeed. I crumbled
under the pressure and for many years I hated school and I regretted the decision to go to college
early. However, my saving grace was the amazing people I found at California State University
Los Angeles (CSULA). My faculty mentors and life-long friends not only helped me rise to the
occasion but even managed to make the experience enjoyable. To my EEP family (Arnie
Sanchez, Ajia Montgomery, Sunny Patel, Roy Cheng, Sudipta and Aditya Mohanty), thank you
for accepting me and never making me feel “less than” even when I couldn’t seem to do
anything right. Jaemy Garcia and Juan Carlos Huezo, thank you for always looking out and
cheering me on every step of the way. To my MA cohort (special shout out to Melissa Tindage)
and Comm crew (Robert Allen and Mylen Yamamoto Tansingco), your friendship and sage
advice over the years helped me grow as a scholar, educator, and person.
There is no doubt in my mind that I would not be here without the faculty at CSULA.
Whether it was professors in my EEP core classes, political science, or communication studies,
thank you for always taking a little extra time out of your day to help this struggling student find
her way. Special thanks to Dr. Bellman and Dr. Broeckelman-Post who put me on a path to
getting a PhD by providing the training, opportunities, and mentorship required to move to the
next level. Dr. Chao, thank you for keeping me well fed and entrusting me to carry on your
Trojan legacy under Peggy. Dr. Olsen, it brings me so much joy to share this full circle moment
with you. As you know, your undergrad honors public speaking course traumatized me and I
vowed never to take another communication course again. I still cannot believe you managed to
convince me to get a master’s degree, teach courses, and pursue a PhD (all in communication).
Your unwavering belief in me since I was 17 has single handedly changed the course of my life.
Without you and your favorite phrase “it’s not like you have anything better to do with your time,
do you?” there would be no USC and definitely no PhD in communication.
Peggy McLaughlin
Just like with everything else in my life, my time at USC has been defined by the
extremely kind and generous people I had the privilege of getting to know and work with over
the years. To my advisor, Margaret McLaughlin, you have been my staunchest advocate and
fiercest supporter since I entered this program. I have unequivocal trust in you and your
judgment because you have always had my back and advised me based on what is best for me.
Never once did you try to make me something I was not. Even when it was clear I was not going
to be a traditional health Comm scholar, you gave me the freedom to explore my interests while
offering guidance, thoughtful advice, direction, and the occasional reality check to rein me back
in when I needed it. What I have come to admire most about you is your character. I often think
back to one of our first conversations in your office when you made it clear that you were not
trying to create new “Peggy McLaughlins.” You had no intention of using your students to gain
publications or further your career in any way. “This was your PhD,” you kept saying and you
saw your role as helping me become the best scholar I could be, not the other way around.
The one thing you did ask was that I make two promises. The first was that no matter
what happened I would remember that I was a person first, scholar second. You expected me to
make decisions based on whether they were right for me, not if they would make you or anyone
else happy. Second, I was never to apologize for putting my health and wellbeing first because
“there is always more work but there is only one Jillian.” True to your word, you led by example
and made sure I prioritized myself and my interests throughout the program.
Colin Maclay
In a similar vein, Colin Maclay, you have been my rock advising me through the
dissertation to the finish line. You stepped in when I had to start over from scratch after the
pandemic hit and generously met with me every week for two years, listening to my ups and
downs, celebrating successes, and picking me up when I stumbled and fell. Even during the most
challenging weeks/months, you never let me get too deep in my head or fall into a dark place. I
always came away from our weekly meetings feeling seen, inspired, understood, and supported.
More importantly, you taught me how to navigate through these roadblocks, which gave me the
skills and wherewithal to replicate this process if I ever found myself or others in a similar
situation.
One thing I admire about you is the kind and gentle way you pushed me to reach my full
potential. I remember during one of our meetings laying out a really simple yet ambitious
argument that I needed help paring back. You were confused why I was so insistent on playing it
safe since I was confident in the argument and had the evidence to support it. I said, “I’m a
junior scholar and to do this I’d need to go toe-to-toe with some of the biggest names in the field.
I’m not at a point where I can push boundaries or challenge the status quo. That has to wait until
I have tenure.” You said you respected my decision but disagreed that the work couldn’t be done
now. If I wanted, you would help me make it happen and the rest is history.
Thank you for not letting me take the safe route because in doing so I learned so many
valuable life lessons like how to navigate difficult spaces diplomatically, strategies for pushing
boundaries, and to not place artificial limits on myself or abilities. You gave me the tools and
confidence to reach my full potential and do what I’m capable of, not just what is “safe.” No
matter what lies ahead, I couldn’t be prouder of the person and scholar I am today.
Leslie Saxon
Leslie Saxon, I am eternally grateful that you decided to take a chance on a 2nd-year PhD
student pitching an ambitious new research area out of the blue. You pushed me to think bigger,
to look beyond the problems of today or even 5-10 years from now to 20-30 years down the road.
This forward-thinking vision combined with an insistence on practicality, real-world impact, and
deliverables forced me to step outside the traditional academic mindset and grow as a researcher.
In true Leslie Saxon-style, you embraced me with open arms and we’ve been working together
ever since. Some of my best USC memories are hanging out with former CBC team members
(Becca Ebert, Ketetha Olengue, Brittain Bush, Cynthia Romero, Mona Sobhani, and Lauren
Espy) at ICT, cake day, Las Vegas, the Jonathan Club, and Shutters on the Beach.
USC Annenberg Faculty and Staff
To Patricia Riley and Sheila Murphy, each of you contributed to my growth as a scholar
providing valuable knowledge, expertise, and training required to operate at the highest level of
academia. Your mentorship did not stop in the classroom as each of you continued to work with
me to hone my research skills as committee members and even co-authors. Patti and Tom, you
have been in my corner since before day one, supporting me when I was just a CSULA student
who dreamed of one day attending USC Annenberg. I remember how proud you were to
welcome me to USC and I cannot thank you enough for always looking out for me. Thank you to
Lynn Miller who spent hours working with me during office hours to help me troubleshoot
whatever crazy project I happened to be working on at the moment. Peter Monge, it was a
privilege to have you as a professor and director of the doctoral program. You always went
above and beyond, getting to know every student personally, checking in, following up, and
fighting to get us access to every resource available across USC and Annenberg. You took it
upon yourself to make sure every student was taken care of and create a network of PhD students
that resulted in an overwhelmingly positive experience for me. Josh Kun, you had every right to
be upset at me that one time I didn’t do the reading during first year core. Instead of taking it
personally, holding a grudge, or writing me off, you showed compassion and sat me down to ask
1) what was wrong and 2) how you could help. That meeting marked a major turning point in my
career because for the first time I allowed myself (with lots of your encouragement) to forget
about grades and take a risk writing about a crazy idea I was weirdly passionate
about…biometric data. I didn’t know it at the time but this would eventually burgeon into my
current research so thank you for providing the initial spark that started it all.
Thank you to Larry Gross, Arlene Luck, François Bar, and Annenberg Press for
entrusting me with the management of not just one but two flagship journals over the course of
five years. The magnitude of this opportunity was not lost on me as I learned the ins and outs of
the academic publishing process from beginning to end from you and ITID’s Associate Editors.
Arlene, I have loved getting to know you and cherish our weekly chats in your office.
Although I didn’t take classes with them in the formal sense, there were many individuals
who supported me from afar, generously taking time out of their day to meet with me, listen,
provide guidance, or simply check in and see how I was doing. These include (but are not
limited to) Mark Lloyd, Michael Cody, Janet Fulk, Mike Ananny, Marlon Twyman, Dan Durbin,
Tom Hollihan, Jonathan Aronson, Alison Trope, and Jillian Pierson. Thank you to the
Annenberg staff who have saved me countless times (Sarah Holterman, Anne Marie Campian,
Christine Lloreda, and Troy Mikanovich). Thank you also to Geoffrey Cowan, Adam Clayton
Powell III, Susan Goelz, Judy Kang, and CCLP for taking me in this past year and providing me
the opportunity to pursue my interest in cybersecurity.
USC Friends and Support Network
Thank you to my Annenberg buddies Lik Sam (Xam) Chan and Nathan Walter who took
me under their wing day one and still to this day provide valuable friendship, guidance, and
support that has carried me through this program and beyond. To my cohort, I am forever
grateful for the time we spent together. Getting to see each other grow and mature as individuals
and scholars as you start families and embark on exciting careers is something I will cherish
along with the many laughs, struggles, lighthearted disagreements, and good times. Thank you to
my friends Ignacio Cruz, Azeb Madebo, Calvin Kim, Tyler Sarafian-Hiebert, Traci Gillig, Sarah
Clayton, Briana Ellerbe, Lauren Levitt, Emily Sidnam-Mauch, Ming Curran, Stefi Demetriades,
Deborah Neffa Creech, Kate Miltner, Matt Bui, Sophia Baik, Hye Min Kim, TJ Billard, Donna
Kim, Rachel Moran, and so many more for the wonderful hours of conversation and camaraderie
we have shared over the years. A special thank you to my dissertation-writing group (Azeb,
LaToya Council, Bri, Diami Virgilio, Ignacio, and Karina Santellano), without which I would
have never finished. Our late night Zoom sessions, chatting, laughing, teasing, venting,
commiserating, and bouncing ideas have made what can easily be an extremely lonely, dark, and
isolating process one of the most fun, cathartic, and productive periods of my life.
I would be remiss not to acknowledge those outside of USC who have contributed to my
development as a scholar. Rajorshi Ray, our weekly chats have been a source of inspiration and
comfort especially throughout the pandemic. Thank you for always checking up on me and
taking the time to help me theorize and solve whatever “above our pay grade” problem we
happened to land on that week. A huge thank you to Richmond Wong, Collen Chambwera,
Jessica Santana, Amit Rushi, and Felicia Pratt for always being so generous with your time,
friendship, and advice.
Family and Friends
While I don’t like to think about the sacrifices I made in order to complete this degree,
there is one that I do think about often. My grandma, Mary, started this journey with me in 2015
but passed away during my second year while I was in DC doing COMPASS. Grandma, I’m
sorry I wasn’t able to be with you during your final days. It broke my heart that I had to leave
and I’ll never forget how much I hated making (and living with) that decision. It was the only
time in seven years that I truly questioned whether this degree was worth it but I know you
would have told me to carry on because you wouldn’t have had it any other way.
To my extended family (Uncle Paul, the Browns, Gressanis, and Lees) and friends
outside of academia (Kathy Ngo and Kevin Mass, Juan Carlos Huezo, Linda Ogawa, Cris Law,
Sara Muellner, Sudipta and Aditya Mohanty, Andrew Esguerra, and Bradley McLeod) thank you
for keeping me grounded and always providing a much needed respite from the pressures of
work.
To my immediate family, thank you dad for passing on your work ethic and
demonstrating what it means to be a professional. For as long as I can remember, you left home
at 5am and worked 10-12 hours every day. I realize now you didn’t do that because you wanted
to but because that time away from home allowed you to give us the future you wanted us to
have. Derek, you are one of the only people who I genuinely believe could care less about
accolades or that I was even in (or graduated from) a PhD program. It is weirdly comforting to
know that despite how much life changes, some things never change. Tiffany (aka my “lifelong
academic advisor”) I don’t tell you enough how much I admire and look up to you. It is no secret
I have followed in your footsteps. You helped me navigate college at 15 and have been
indispensable throughout graduate school. Your two best pieces of advice to 1) get to know my
classmates and 2) pick an advisor based more on how kind and supportive they were rather than
what they could offer me have made my PhD experience what it was. You are, and have always
been, the smartest, most capable person I know and I don’t give you nearly enough credit for all
that you have accomplished while blazing the trail for me. Kelsey, these days our visits have
become few and far between yet your support has never wavered. I’ve loved getting to share a bit
of my life at USC with you and I appreciate you always taking the time to listen, make me laugh,
and being your wonderfully goofy self.
Last but not least, mom, none of this would be possible without you. You did everything
you could to find me programs that fit and for a long time it seemed like nothing did. I know it
must have been so hard to see me struggle over and over again. There were so many fights and
tears and a lot of regret for many years but thank you for never giving up on me and continuing
to try. Thank you for being my biggest supporter, my confidant, sounding board, therapist, and
moral compass when needed. You managed to keep me grounded yet still outrageously
ambitious, helping me navigate the dark times, reminding me to enjoy the good times, all while
appreciating everything in between. You have taught me to find joy in the journey not the end
goal, to get comfortable with uncertainty, to embrace the unknown, and not be afraid to start over
when I find myself down a path that is no longer right for me. I aspire to emulate your magnetic
personality and one day inspire others by becoming half the person you are with your patience,
strength, humor, dedication, determination, compassion, independence, tenacity, thoughtfulness,
and humility.
Table of Contents
Dedication ii
Acknowledgements iii
List of Figures xiii
Abstract xiv
Chapter 1: Introduction 1
Chapter 2: Translating Data Protection into Practice (Study 1) 4
Chapter 3: The Evolution of the Compliance Landscape (Study 2) 44
Chapter 4: Cybersecurity in Practice (Study 3) 87
Chapter 5: Discussion 117
Chapter 6: Conclusion 124
References 128
Appendix 1: Interviewee Demographics 143
Appendix 2: Study 3 Interviewee Demographics 145
List of Figures
Figure 1: Study 1 Themes 16
Figure 2: Study 2 Themes 64
Figure 3: Study 3 Themes 95
Abstract
This dissertation explored how data privacy and security were implemented and
reproduced within small to medium-sized organizations in the United States. Building on
existing literature that documented the experience of privacy practitioners ‘on the ground,’ this
project sought to add to the conversation by elevating the study of human behavior as the focal
point of the research. Unlike previous scholarship which centered its focus on privacy law (e.g.
how law and policy were altered from their original intent by humans), this project prioritized the
study of human decision-making and behavior to investigate the reciprocal relationship between
people and their institutional environments (e.g. how individuals interacted and influenced the
institutional structures around them while simultaneously being impacted by these same
structures). Approaching this research from a communication perspective allowed the project to
transcend the established boundaries of the field by bringing together multiple bodies of
knowledge and perspectives. The complexity of working at the intersection of multiple domains
required this dissertation to push beyond conventional disciplinary structures since phenomena
could not be fully understood or contained within traditional academic silos. Using semi-
structured interviews, this dissertation argued for greater emphasis to be placed on understanding
the everyday actions, experiences, dilemmas, and contradictions encountered by practitioners as
they attempted to enact organizational change around data. Insights from this research made a
case for exploring data privacy and security from an embedded agency perspective. Additional
understandings regarding the constraints posed by institutional structures and regulatory
environments opened up opportunities to examine how future legislation and efforts to protect
data in the United States can be improved.
Chapter 1: Introduction
While there are many opinions about whether data really is the ‘new oil’ of the digital
economy, the bottom line is that data in society is messy. This is especially true in the United
States, where data is regulated by sector. This means the same piece of information could be
treated very differently depending on the context in which it was collected, leading to confusion
and misuse of data.
As countries scramble to pass laws in an attempt to adjust to the increasing digitization of
everyday life and society, the United States continues to be defined by fragmented, sectoral laws
that vary considerably by state and industry. As of June 2022, five states have passed
comprehensive privacy laws while others continue to enact sector-specific legislation (Lively,
2022). The inconsistencies in definitions and differing rules across states pose major challenges
not just to individuals and government regulators but companies operating in multiple
jurisdictions across the country.
This dissertation explored this phenomenon, specifically how data practices in the United
States evolved in response to changing environments and compliance rules. Even though certain
laws in the United States protected data collected in narrowly defined contexts, the ability to
gather, aggregate, process, and share various kinds of data meant new types of data were always
being created that often existed outside of government regulation. With many of the traditional
boundaries separating different aspects of people’s lives disappearing, there was a need to
understand how society would adapt and move forward since many of these legacy systems and
frameworks were not designed or built to accommodate context collapse.
Two questions in particular motivated the dissertation: 1) how did data privacy and
security get enacted and reproduced within organizations and 2) how should data practices and
government regulation evolve in order to meet the demands of tomorrow? The goal was to shed
light on the various forces and structures affecting human behavior including context collapse,
interwoven systems, recursive practices and processes, value systems, and incentive models.
Drawing on theoretical traditions of privacy law and institutional theory (from the field of
organization studies) this research explored the dynamic forces that affected data privacy and
security in practice. Interviews with privacy lawyers, chief privacy officers, information security
experts, and other practitioners, showed how traditional ways of defining, measuring, and
protecting data and privacy failed to account for the existing values, assumptions, organizational
realities, and practices ingrained and reinforced by the US sectoral approach to privacy law.
Although agency was touted as a fundamental value of US culture and privacy law, the ability to
exercise agency was limited even for practitioners tasked with managing data and risk within
organizations. Deeper analyses of practitioner accounts revealed recursive practices and
problematic incentive models that were characterized by the performance of compliance and
forced trust in technology and information systems. Given the recent push in California and
across the United States for stronger data protection laws, this line of research was especially
important for closing gaps that characterized the patchwork approach to US privacy policy.
Preview of Chapters
The dissertation begins with this introductory chapter, which provides an overview of the project. Chapter 2
provided an overview of methods used throughout the dissertation and documented the
experiences of privacy practitioners at small to medium-sized enterprises (SMEs) as they
navigated the changing regulatory environment in the United States. Using an inductive, bottom-
up approach to coding interview data, this study (Study 1) looked at how data privacy laws were
translated into practice. Study 1 results discuss how companies conceptualized privacy within
organizations and adjusted their practices in response to new compliance mandates. Chapter 3
examined the impact of regulation (i.e. the California Consumer Privacy Act) on the
implementation of privacy within organizations. This study (Study 2) focused on the internal
organizational dynamics related to culture change and compliance mandates. Unlike Study 1,
which did not have any preconceived theory or constructs guiding analysis, Study 2 built on
existing themes in the literature and utilized a deductive, top-down approach to coding data.
Results identified institutional structures and recursive practices that created, reinforced, and
reproduced mentalities, social norms, and practices that complicated the adoption of more
progressive privacy protections within organizations. Chapter 4 was a pilot study (Study 3) that
investigated the factors impeding the development of cybersecurity within organizations. This
research differed from Studies 1 and 2 as it sought to highlight narratives, issues, and challenges
specific to the cybersecurity field that were separate from privacy. This study used a subset of
interviews from the main corpus of data to look at the routine taken-for-granted assumptions,
values, collective practices, and patterns of behavior that impacted how security was prioritized
and implemented in practice. Although related, privacy and security represent distinct disciplines
both in academic literature and practice. Cybersecurity as a field was normally very technical
focusing on mechanisms that controlled and prevented unauthorized access to data. Privacy on
the other hand was much more abstract by nature and had always been deeply intertwined with
larger conceptual issues such as ethics, law, and individual rights. Findings highlighted the
current state of cybersecurity and misalignment in incentive models. Chapter 5 put research
findings into dialogue with current privacy ‘on the ground’ literature highlighting both
divergence and convergence of results. Chapter 6 concluded the dissertation by summarizing
findings, limitations, and directions for future inquiry.
Chapter 2: Translating Data Protection into Practice (Study 1)
While data privacy and protection have been important topics in western societies
throughout history, the 21st century brought new attention and urgency to the issue with data
becoming an increasingly valuable asset for organizations in today’s economy (Andrew & Baker,
2021; Hartzog & Richards, 2020). However, as the collection, use, and sharing of data expands,
so do the risks involved in protecting that data. Although the European Union (EU) had already
been working on the General Data Protection Regulation (GDPR) for years, a series of high
profile events impacting consumers in the United States in the 2010s (e.g. Facebook and
Cambridge Analytica scandal, massive data breaches from companies including Experian,
Equifax, Google, LinkedIn, Marriott, etc.) highlighted egregious abuses committed by
companies that heightened public awareness and galvanized support for greater oversight of this
previously unregulated space (Hill & Swinhoe, 2021; Lapowsky, 2019). The GDPR represented
one of the first global comprehensive laws related to data protection. It was designed not just to
introduce standards within its own jurisdiction but catalyze a new era of data governance and
accountability both inside and outside the European Union (Schwartz, 2019). In many ways, the
GDPR fundamentally changed the regulatory landscape of technology and business around the
world by establishing minimum standards and inspiring numerous governments to pass their own
versions of data mandates (Hoofnagle, van Der Sloot, & Borgesius, 2019).
Since the enactment of the GDPR in 2018, similar laws have been adopted around the world. In
terms of the United States, the most notable examples are the California Consumer Privacy Act
(CCPA) and the California Privacy Rights Act (CPRA). Much like the GDPR, these bills have
generated enormous interest across the country due to their potential to significantly impact
consumer privacy and the data landscape throughout the United States. However, optimism
around the GDPR and CCPA has been tempered as preliminary research has found mixed results
in terms of effectiveness. For example, early studies have criticized the ability of both the GDPR
and CCPA to increase protection of consumer data, citing 1) the increasing vulnerability of data
and 2) the difficulty of exercising privacy rights. Experts point out that despite advances in
legislation and greater regulatory oversight, there has been relatively little improvement in
overall consumer data protection. For instance, the number and cost of breached records
continues to skyrocket year after year (IBM Security, 2021; Lohrmann, 2021) as consumer trust
in companies to protect their personal data has stagnated (Anant, Donchak, Kaplan, & Soller,
2020; Entrust, 2021; IBM Security, 2018). According to Shaty (2021), there were over 870
million records compromised in January 2021 alone (more than the total number of records
compromised in all of 2017). From January 1, 2021 to June 30, 2021, there were 1,767
publicly reported breaches, which exposed approximately 18.8 billion records (Security
Magazine, 2021). While there are no doubt many successes and positive developments
associated with these pieces of legislation, the numbers suggest there is still quite a ways to go in
terms of securing data and protecting users’ private information.
Additionally, academic research and investigative reporting have also identified
shortcomings in the laws themselves, particularly 1) difficulty exercising rights (King, 2020;
Waddell, 2021a), 2) lack of recourse to address unauthorized surveillance of users (Gundersen,
2020), 3) inability to prevent unnecessary use or sharing of data (Mandalari, Dubois, Kolcun,
Paracha, Haddadi, & Choffnes, 2021), and 4) limited effectiveness of data requests (Waddell,
2021b). This raises a question: how are comprehensive laws like the CCPA being implemented
within organizations in the United States and what factors influence this process? Investigating
this phenomenon in greater detail should provide policymakers additional context for how
legislation can be improved and perhaps increase consumer data privacy and protection across
the United States.
The State of Data Privacy and Protection in the United States
This study explored the US regulatory environment by taking a closer look at
organizational responses to data privacy mandates. By focusing on how privacy practitioners at
small and medium-sized enterprises (SMEs) translated data privacy laws into practice, this work
shed light on gaps in the United States’ approach to regulation and enforcement. While much
scholarly attention has focused on the laws and legislation governing the data privacy space, this
work adds to the conversation by prioritizing the human interactions taking place on the ground
within organizations. Although lawmakers, privacy advocates, scholars, and the public have
expressed frustration about the pace of change, there has not been extensive communication with
privacy practitioners within industry to understand the difficulty of data privacy compliance.
The motivation for this study was to expand the boundaries of ongoing privacy dialogues
to include additional perspectives that have not received adequate attention within scholarly
literature. Despite groundbreaking work by Bamberger and Mulligan (2011; 2015) and Waldman
(2018; 2020), privacy on the ground is still largely understudied across disciplines. Building on
these approaches, this study explored this space through a communication lens, which provided
an ideal entrée to understanding the breakdowns that occurred at the intersection of policy, law,
business, management, and organizational behavior. Bringing the practitioner experience to the
forefront broadened the purview of current debates around privacy by enhancing communication
between stakeholders and raising awareness about the difficulty of regulating data in the United
States. In terms of organization, this chapter first provides context for understanding approaches
to data privacy and protection in the United States and the European Union. This section begins
with a brief overview of how each approach developed, followed by a discussion of the major
distinguishing characteristics of each model, and a summary of study methods. The purpose of
reviewing this literature was to explicate the importance of terminology and clarify the
difference between two terms that are often mistakenly used interchangeably, specifically ‘data
privacy’ and ‘data protection.’ The United States’ decision to define and codify laws in terms of
data privacy (rather than data protection like the European Union) has major implications for
how these laws converge and diverge in practice. The results section discusses 1) how
practitioners perceive the regulatory environment, 2) how certain aspects of privacy laws interact
with existing organizational realities on the ground, 3) organizational responses to privacy
implementation, and 4) how compliance mandates get adapted to fit existing practices.
Literature Review
Regulatory Approaches to Data
The European Union Approach to Data
The European Union subscribes to a comprehensive approach to data that treats privacy
as a fundamental human right. This comprehensive approach to data regulation is commonly
referred to as data protection. Data protection regimes like GDPR are characterized by omnibus
laws, the existence of a single, centralized enforcement authority, and ‘binary governance’
models that balance individual rights to due process with industry accountability and
collaborative governance (Bamberger & Mulligan, 2015; Kaminski, 2019). Unlike in the United
States, these omnibus laws apply equally across both government and the private sector.
Although data protection regimes existed prior to the GDPR, the EU’s proactive approach to data
regulation and information governance has spread around the world, laying the foundation for global market norms and data trade (Hartzog & Richards, 2020).
According to Swire and Kennedy-Mayo (2020), the European Union’s comprehensive approach to data came about for three reasons. First, there was recognition that additional
safeguards were needed to protect privacy and ensure past injustices could not be repeated.
Second, due to the number of diverse countries coming together under the umbrella of the
European Union, minimum standards were essential for creating a shared system of meaning to
account for differences in culture, language, history, government, etc. A comprehensive
approach was vital for establishing consistency and streamlining privacy laws across member
states to prevent disruption in trade and continue the movement of data. Third, a consistent set of
laws and standards was vital for promoting public trust and engagement in electronic commerce.
Streamlined rules and procedures established a stable foundation that provided assurances to
users that their data would be safe by outlining expectations and behavior in practice (May, Chen,
and Wen, 2004; Swire & Kennedy-Mayo, 2020).
The United States Approach to Data
Compared to the European Union, the United States subscribes to a sectoral approach to
data regulation that treats privacy as a consumer protection. This approach is commonly known
as data privacy. The US model is characterized by bifurcation, or separate regulatory frameworks
for overseeing government and the private sector (Richards, Serwin, and Blake, 2022). This
means government and private companies are often held to different standards when it comes to
individual rights and protections guaranteed by law. Within this model, rules vary by industry
and protections do not necessarily carry over or apply equally across the public and private
sectors. Unlike the EU, whose basic data privacy rights apply regardless of the situation, the United States is more complex, taking into account the context of when, where, and how data was collected. Rather than having one omnibus, federal privacy law, the US approach has
multiple piecemeal regulations that are enforced by government agencies. Compared to the EU’s
comprehensive, top-down rule-making structure, the US sectoral system constructs privacy in a
more distributed, horizontal manner. Hartzog & Richards (2020) detail the multiple elements
responsible for shaping privacy law in the United States including the FTC as the “de facto
regulator of privacy in the United States,” state attorneys general as regulators and norm
entrepreneurs, and privacy lawyers and technology designers providing privacy on the ground (p.
1698).
Scholars argue this approach largely grew out of concern for government overreach and
abuse with US legislation providing citizens “greater protection against the collection and use of
personal information by government, as opposed to the private sector” (Levin and Nicholson,
2005, p. 362).
The California Consumer Privacy Act. The US federal government’s slow
development of data privacy law and policy has opened the door for GDPR principles and values
to permeate American consciousness and law. The most high-profile example of
the GDPR shaping US policy would be the state-level California Consumer Privacy Act (CCPA).
Although the GDPR and CCPA share many similarities, scholars caution American audiences
not to conflate the two pieces of legislation noting important differences in the construction of
EU comprehensive data protection laws and the overarching legal and regulatory systems.
These factors complicate the translation process when laws are imported to countries like the
United States. Hoofnagle et al. (2019) pointed out that the GDPR follows more of a ‘principles-based’ form of regulation that includes more vague language, ‘aspirational’ principles, and
‘indeterminate recitals’ that “flummoxes U.S. lawyers because of its lack of specificity” (p. 67).
According to Hoofnagle et al. (2019), this lack of specificity often carries over into regulators’
expectations around compliance and enforcement. Rather than fixating on ‘perfect compliance,’
EU regulators typically have more informal, ongoing interactions and engagement with
companies outside of formal enforcement actions compared to the United States. Enforcement
tends to adopt a more lenient stance that reserves large judgments and fines for serious abuses
and negligent behavior rather than simple mistakes or noncompliance (Hoofnagle et al., 2019).
As a whole, “the EU’s omnibus approach to data protection is based on individual rights over
data, detailed rules, a default prohibition on data processing, and a zealous adherence to the fair
information practices (FIPs). In contrast, the patchwork approach of the United States is more
permissive, indeterminate, and based upon people’s vulnerabilities in their commercial
relationship with companies” (Hartzog and Richards, 2020, p. 1690).
The decision to define and codify laws in terms of data privacy versus data protection has
had major implications for how each system developed and evolved over the years. Each system
is significantly different in terms of its conception and approach to privacy and data regulation.
This raised questions about how these laws would converge and diverge in practice. Given the
United States’ distinct history, state variation in regulation, and unique infrastructure, it was
important to understand how comprehensive data privacy mandates were being translated into
practice within the United States. Study 1 focused specifically on the impact of the regulatory
environment on privacy by documenting the experiences of privacy practitioners at SMEs as
they navigated the changing regulatory environment in the United States. Research questions
were as follows:
RQ1: How did practitioners at SMEs in the United States perceive the regulatory
landscape?
RQ2: What aspects of data privacy laws and mandates were important in practice?
RQ3: How did SMEs respond to the changing regulatory environment in the United
States?
Methods
Data Collection
Interviews
A total of 18 interviews were conducted between February 2020 and February 2022.¹ In-depth, semi-structured interviews were conducted to gather and probe individual practitioners’
knowledge, motivations, perceptions, and opinions about 1) the general trends and challenges
companies faced regarding data privacy and security compliance, 2) how the industry as a whole
was evolving in response, and 3) the interviewee’s approach to working in this space.
Interviewees included a range of experts from across law, privacy, and security fields.
Interviews ranged from 30-90 minutes with the majority of interviews lasting 60 minutes. Due to
the pandemic, most interviews were conducted virtually either by phone or video conferencing
platform (e.g. Zoom). Most were audio recorded and transcribed verbatim with the consent of the
interviewee. In cases where interviewees did not wish to be recorded, the interviewer took
detailed notes of the interview with the interviewee’s permission.
Participants. Participants included legal experts (e.g. privacy lawyers), privacy professionals
(e.g. chief privacy officers, privacy consultants), and security experts (e.g. information security
¹ The Covid-19 pandemic created challenges in data collection and subsequently required additional time to gather data.
experts, chief information officers, white-hat hackers). The majority of interviewees were based
on the West Coast in California, while a quarter were based on the East Coast (Washington, DC, New York, and Florida) and two were based internationally (Australia and Canada).² Interviewees either worked for or represented companies from various sectors
including education, healthcare, business, and advertising to name a few. Interviewee
demographics including domain, employment sector, job title, and years of experience can be
found in Appendix 1. Questions for semi-structured interviews focused on understanding each
interviewee’s role within the data ecosystem, philosophy towards data, the impact of shifting
environments (e.g. regulatory changes, threats, the pandemic, etc.), their company’s/employer’s
approach to data use and protection, general challenges facing the industry as a whole, and their
strategies for managing risk and navigating uncertainty related to issues they identified.
Recruitment. Because pandemic social distancing measures impacted in-person
meetings, most recruitment took place online. Recruitment began by reaching out to privacy and
security experts (e.g. Chief Information Officers, Chief Privacy Officers, Chief Information
Security Officers, privacy and security researchers, etc.) at prominent universities to identify and
recruit relevant participants, while also using snowball sampling. In addition, I attended data
protection webinars, virtual meet-ups, and online seminars from organizations such as
International Association of Privacy Professionals (IAPP) and Data Protection World Forum,
where I met and scheduled interviews with data protection practitioners. In order to gain a fuller
understanding of the current privacy landscape and policy implications, I joined IAPP-run web
forums/listservs and consulted industry blogs, news, and industry reports. I also obtained my
² Due to Covid-19 lockdowns and restrictions, some participants happened to be working remotely outside the United States at the time of their interview. Despite physical location, all participants’ knowledge, expertise, and normal job functions involved working with companies in the US context.
Certified Information Privacy Professional (CIPP) certification from the world’s leading
information privacy organization, IAPP, to better understand the privacy landscape from a
practitioner perspective.³
Sample. The sample differed significantly from other studies in existing literature due to
its focus on participants at small to medium-sized organizations. While studies by Bamberger &
Mulligan (2015) and Waldman (2018; 2020) focused on gathering perspectives from the top
echelon of businesses in the world, this study was interested in exploring how practitioners at
non-Fortune 500 or Big Tech companies were adapting to data mandates. By focusing on
practitioners working with or at small to medium-sized enterprises (SMEs), this research
highlighted how the data privacy landscape differed within the private sector. While interviews
with practitioners at Big Tech companies are useful, they represent a very small percentage of
companies around the world. The access, resources, scrutiny, competitive environment,
organizational structure, and business strategies (among other things) are so unique for these top-
tier organizations that it would be a mistake to generalize this as the norm for all companies
within the private sector. This study added to the conversation by expanding privacy literature to
include practitioners working with SMEs as they navigated the privacy landscape without many
of the resources and competitive advantages afforded to practitioners working at the most elite,
well-financed businesses in the world.
Privacy Practitioners. One reason why this study used the term ‘privacy practitioners’
instead of ‘chief privacy officer (CPO)’ was to account for differences in structure between
³ The IAPP’s CIPP certification provides accreditation to the world’s top privacy and data protection experts in both law and private industry. Achieving CIPP certification requires passing a rigorous exam to demonstrate mastery of jurisdictional laws, regulations, enforcement models, and legal requirements for handling and transferring data. Additionally, maintaining certification requires 20 hours of Continuing Privacy Education (CPE) per term.
SMEs and large corporations. Many Fortune 500 companies now have dedicated chief privacy
officer positions that lead entire departments devoted to privacy and data protection. However,
this was not necessarily common for most SMEs. Data privacy and protection staff at SMEs in
the United States could range from no full-time staff (in which the company would hire outside
consultants to build their compliance programs), to having one or two staff members (e.g. the
chief privacy officer and their assistant or others in related positions), or small teams (usually no
more than three to five people at most). Setting inclusion criteria to include only chief privacy officers or companies with established departments would have ignored the experiences of
an entire group of practitioners within the private sector. The strategic decision was made to refer
to the sample as “privacy practitioners” as it opened up a wider range of perspectives and
experiences that would not have otherwise been included.
Qualitative Analysis
Because the goal of the study was to explore how privacy practitioners navigated the
regulatory environment in the United States, an inductive bottom-up approach was used to code
study data. Inductive coding approaches are “data-driven,” meaning they begin the analytic process from the “bottom” (i.e. the data itself). The study’s corpus of data initially guided analysis and development of themes (as opposed to being driven by theory).
Theory-driven approaches are distinguished from data-driven approaches by their interest in
applying specific analytical or conceptual lenses, which focus attention on a particular aspect of
the data. According to Braun and Clarke (2006) inductive analyses usually involve “coding the
data without trying to fit it into a pre-existing coding frame or the researcher’s analytic
preconceptions” (p. 83). For Study 1 there was no predetermined codes or ideas about the data
driving analysis. The process of finding meaning and generating themes from the dataset was
done inductively without outside influence from previous literature (Auerbach & Silverstein,
2003). The resulting research themes reflected “patterned” responses or shared meaning within
the dataset. These themes extended understanding of the privacy landscape by capturing “the
essence and spread of meaning [uniting] data that might [have] otherwise appear[ed] disparate,
or meaning that occur[ed] in multiple and varied contexts […it represented] abstract entities or
ideas, capturing implicit ideas ‘beneath the surface’ of the data, but [sic] also capture more
explicit and concrete meaning” (Braun, Clarke, Hayfield, and Terry, 2019, p. 845).
Using Miles, Huberman, and Saldana’s (2014) two-cycle coding, the first cycle utilized
initial and in vivo coding to explore the data. This process generated codes that emerged
organically from the interview transcripts. Constant comparative analysis was used throughout
the data analysis process: new codes were constantly compared with old codes and regularly analyzed, updated, and redefined. As Thornberg and Charmaz (2014) note, this process compares “data with data, data
with code, and code with code, to find similarities and differences” (p. 158). From here, the
second cycle focused on grouping codes into categories based on themes. This process of
analyzing similar content from across interview data eventually resulted in the generation of
common themes. The goal was to find the most salient categories within a corpus of data, which
requires the researcher to make “decisions about which initial codes make the most analytic
sense” (Charmaz, 2006, p. 57). A conceptual map of themes can be found in Figure 1.
Figure 1
Study 1 Themes
Results and Discussion
The emphasis on the interactions taking place within SMEs highlighted how decisions
around privacy got negotiated in practice which served to illuminate previously overlooked
problem areas and suggest alternative approaches for regulating and addressing gaps in these
systems. Findings from interviews provided insights into how laws were operationalized within
SMEs and the resistance encountered by practitioners as they attempted to balance the
implementation of numerous data mandates while maintaining normal business operations.
Results highlighted how ambiguity and unpredictability in the regulatory environment
contributed to internal resistance against practitioners and organizational change in the
workplace. Unstable regulatory environments reinforced existing organizational realities that
clashed with updated conceptions of data privacy and compliance. In particular, interviewees
described how ambiguity in data privacy laws became points of contention between departments
as each stakeholder group came to the table with their own values, priorities, interpretations, and
understandings of what laws meant and their subsequent responsibilities. These multiple
competing interpretations influenced not only how individual departments conceptualized and
approached privacy but also shaped the organization’s overarching conception and approach to
privacy as well. Leadership buy-in and management mentality toward data were key factors that
contributed to how organizational responses to privacy mandates developed and played out.
Interviewees provided examples of how key accountability measures like data inventories and
data retention were often circumvented in practice, and described how framing privacy compliance in
terms of risk could help motivate action and mobilize resources toward change. Study findings
not only aligned with and supported existing research on organizational change theory but also
added vital context into how these issues played out on the ground.
Unstable, Unclear, and Conflicting Regulatory Environments
Participants described regulatory environments in the United States as complex for three
reasons including 1) misconceptions around privacy expectations, 2) unstable regulatory
environments, and 3) difficulty deciphering laws in practice.
Managing Misconceptions and Expectations Around Privacy
Practitioners described how growing awareness among consumers regarding the GDPR
and CCPA protections could lead to misconceptions and misconstrued expectations of data rights
in the United States that do not actually exist. While certain companies, particularly
multinational corporations, might have set a precedent by choosing to extend protections to all
users regardless of location, it did not mean those rights actually existed or were uniform across
jurisdictions. PRIVACY_10, a privacy consultant specializing in global compliance and privacy
management with over 25 years’ experience, gave the example of Microsoft implementing a
process to respond to the GDPR data subject requests (DSARs) from around the world rather
than just the European Union.
Do you know where most of the data subject access requests ultimately came from? The
United States. GDPR basically spammed everybody on the planet in 2018 saying ‘you
have rights, can we continue writing to you?’ So what did it do? It educated everybody. It
was a communication strategy that by mistake made everyone think they had the same
rights. Now it’s created a consumer expectation that is actually irrelevant as to the
jurisdiction you’re living in. It globalized the European privacy principles. It’s now
everyone’s expectation. –PRIVACY_10
This expectation put practitioners at SMEs in an interesting position. Although the
growth of privacy rights expectations among the public and legislators can be seen as the start of
market building that might eventually create enough demand for more robust protections, it
should not be confused with the implementation of actual protections. Part of the challenge for
practitioners at SMEs was helping their organization walk the fine line between upholding consumer expectations of privacy and avoiding additional liability or the appearance of shirking responsibility. Promising too much could open up problems if the organization could not live up to its own assurances or policies, resulting in anything from a branding backfire or consumer mistrust to an FTC investigation.
According to interviewees, creating a uniform strategy for SMEs in the United States was
quite difficult due to varying rules within a sectoral system. Compared to SMEs, multinational
corporations had enormous means and entire teams that navigated these challenges, allowing
them to meet compliance standards and consumer expectations easily without a uniform federal
privacy bill. This was a very different story for SMEs, many of whom struggled to reconcile the
plethora of similar, yet conflicting state laws. PRIVACY_3, a partner attorney at a California law
firm specializing in cyber law, data security, and privacy with over a decade of experience, gave the
example of data breach notification laws when explaining why a federal privacy law would be
helpful, particularly for SMEs, to harmonize the multitude of state bills:
Every state has their own data breach notification laws, they’re substantially similar but
there’s enough nuance and differences in between them that you have to go through each
one of them and figure out how to comply. Some of them say you have to tell what
happened and some of them say you can’t tell what happened. Some of them say you have
to do an assessment to see if it’s likely that there’s going to be an impact to somebody.
Some of them just say report or notify anyway.
Those little differences from a legal perspective are really hard because your goal
is to be precise and when there’s a little bit of a difference in what is personal
information or what it breaches or what is a violation of a particular law, you’re going to
try to want to cover for everything to make it simpler for your business practices to do
that, it’s just going to get really hard. We’re seeing kind of two forms of privacy laws,
we’re seeing CCPA copies and we’re seeing GDPR copies or some hybrid of the two and
they’re not exactly overlapping and so you’re going to be required to do some things in
some states versus not in others, so it will be interesting. That will be a challenge. –
PRIVACY_3
According to the International Association of Privacy Professionals (IAPP), as of June
2022, five states (California, Colorado, Connecticut, Virginia, and Utah) had signed comprehensive privacy bills into law (Lively, 2022). Maine and
Nevada each have state privacy bills but they are not necessarily comprehensive (Rippy, 2021).
Layered on top of this are international regulations such as the EU (GDPR), the United Kingdom
(UK GDPR), Canada (Personal Information Protection and Electronic Documents Act), Australia
(Privacy Act 1988, Australian Privacy Principles, and Australia Privacy Principles Guidelines),
and China (Personal Information Protection Law) that can also indirectly affect private sector
companies in the United States (The Westin Research Center, 2021).
Unstable Regulatory Environments
The introduction of comprehensive data privacy mandates in 2018 marked the beginning
of a new era of regulatory activity around the world. Participants described this regulatory
environment as dynamic, uncertain, and constantly evolving due to new state, sector, and global
regulations being passed and clarified on an unpredictable basis. The quick passage of the CCPA
left many practitioners confused and scrambling to keep up with the myriad of developments,
proposed changes, and revised deadlines. PRIVACY_8, a partner attorney specializing in privacy
and data security at a national law firm, described what this environment looked like and why it
posed such a challenge for practitioners on the ground.
There has not been any certainty. There has not been any stability over what this law
ultimately looks like and that remains true right now [July 2020]. Yes the statute passed
in June 2018 but it was amended a couple months later and then we had this whole
experience of drafting regulations and rule makings that occurred for the past year and
those are still not finally approved. Then we’re faced with the prospect of a ballot
measure and then the enforcement deadline was delayed.
So when a company is trying to figure what they need to do and when that answer
has not been incredibly easy to come by there’s a lot of uncertainty and it puts lawyers, in
particular, in a position of having to not be certain which is what people hate about
lawyers. We always say ‘it depends’ or ‘this is defensible’ or whatever and there’s no
kind of ultimate certainty around this and that’s still the case. –PRIVACY_8
This example provided a glimpse into the perceived ‘regulatory limbo’ practitioners
experienced as they navigated these laws in practice. Since the interview in July 2020, there have
been major developments such as the passage of the California Privacy Rights Act (CPRA)
ballot measure, which is scheduled to take effect in January 2023. Despite some added clarity in
the laws themselves, there are still many questions around the regulation and enforcement of this
data that have yet to be answered.
Deciphering What Laws Mean in Practice
According to interviewees, a major challenge for SMEs was deciphering which mandates
applied to their organization and how to put them into practice. Part of the confusion stemmed
from the CCPA being carefully crafted not to conflict with any existing privacy rules such as the
Health Insurance Portability and Accountability Act (HIPAA) or Gramm–Leach–Bliley Act
(GLBA) which opened up new layers of complexity by introducing regulatory gaps and grey
zones. Navigating this environment involved parsing many potentially applicable, but often
conflicting, mandates and then deciding which regulations to prioritize and orient company
policy around. In this case, even though the CCPA applied across multiple sectors in California,
there was still a lack of uniform standards or benchmarks that companies could aim toward and
certify against for compliance. PRIVACY_6, a privacy consultant with over 20 years’ experience
working in the field, explained:
I think that the challenge there is in the absence of a federal regulation and in the
prevalence of sector-based regulation there isn’t a single standard that companies can
certify against in order to be able to demonstrate that even an external auditing authority
confirmed their compliance. So we’ve got ISSA [Information Systems Security
Association] certification, you’ve got the ability to build out NIST [National Institute of
Standards and Technology] frameworks, you can get SOC 2 [Service Organization
Control 2] and 1 and 3 for that matter. There are certain information security standards
out there that are national and international in nature but I think that what we would
need is probably another standard. I think that in a lot of ways, ISO 27701- the privacy
regulation that added on to ISO 27001 [ISO/IEC 27001 is an international standard on
how to manage information security] attempted to go in that direction but it’s not really
mapped correctly to privacy. It’s like a security standard they tried to co-opt into privacy
as opposed to doing it on its own merits but I think if we had some degree sort of degree
of certification but at the end of the day quite frankly with data it’s not possible to
guarantee and people can lie. –PRIVACY_6
The lack of consistent standards within the patchwork of existing legal frameworks
created confusion around what laws data was subject to and how organizations were expected to
respond. This problem was especially pronounced for companies coming from industries without
previous experience being regulated, with many organizations struggling to meet basic
obligations under the new provisions. Most companies were starting from scratch in terms of
developing new processes, practices, structures, and logistics to support updated definitions of
privacy compliance and accountability. Participants stressed that the solution was not just about
keeping better track of data but changing the entire culture around data. Over half of the
interviewees with security backgrounds noted that many companies do not have the knowledge,
experience, capability, or infrastructure in place to adequately protect the extremely sensitive
(and highly valuable) data they are collecting and making accessible to businesses. However,
even if policies were in place, there was no guarantee they would look the same in practice.
New Technology Pushing the Boundaries of Sectoral Systems. Another reason
participants described the regulatory environment as complex was the steady stream of new
technologies that continually pushed the boundaries of sectoral laws and systems of practice. The
increasing digitization of many products and services meant that highly specific data that used to
only exist within certain contexts could now be combined, shared, and moved across multiple
sectors, creating confusion about what responsibilities and obligations organizations had when
handling certain information. One example that came up across six interviews was telehealth and
wellness data.
Telehealth. Participants pointed out that while many people assume telehealth is covered
by HIPAA, that is not necessarily the case. According to the US Health Resources and Services
Administration (2019), “telehealth is different from telemedicine because it refers to a broader
scope of remote healthcare services than telemedicine. While telemedicine refers specifically to
remote clinical services, telehealth can refer to remote non-clinical services, such as provider
training, administrative meetings, and continuing medical education, in addition to clinical
services” (para. 2). This seemingly minor distinction can have major implications for when and
how HIPAA applies to data. PRIVACY_8 outlined some misconceptions many non-healthcare
companies have when entering into this space.
The area to focus on is the healthcare adjacent companies. [With] telehealth, you have
these different kinds of apps or tech-type services, sharing economies where they are
increasingly offering healthcare models. Wellness is often not necessarily healthcare but
its health adjacent and it might involve healthcare data. I think for those kinds of
companies, they have to look a lot more closely at it because they may not understand
that they can still be under HIPAA or they may be avoiding HIPAA.
For example, take all these tech solutions that are being developed for contact
tracing for Covid. If you have an app that requires the consumer to upload their
diagnosis or their test results, that information is not regulated by HIPAA because if the
consumer is just voluntarily providing it, it’s not coming straight from the medical
provider. If the employer is getting access to that data…it’s not HIPAA regulated. It’s
thorny but it seems like it is. That’s an example where you really have to look at the
context in which it was collected and what it’s being used for. –PRIVACY_8
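The contextual test PRIVACY_8 describes can be sketched as a simple decision function. This is only an illustration of the reasoning in the interview, not legal guidance; the category labels and the function itself are hypothetical constructs, not drawn from the statute.

```python
# An oversimplified sketch of the context test PRIVACY_8 describes: whether
# HIPAA applies depends less on the data itself than on who holds it and how
# it was collected. Categories are hypothetical; this is not legal guidance.

COVERED_ENTITIES = {"medical provider", "health plan", "clearinghouse"}

def hipaa_likely_applies(holder: str, collected_from: str) -> bool:
    """Return True when health data is held by a covered entity and
    flowed directly from a clinical source."""
    return holder in COVERED_ENTITIES and collected_from == "medical provider"

# The same Covid test result in three different contexts:
print(hipaa_likely_applies("medical provider", "medical provider"))    # True
print(hipaa_likely_applies("contact-tracing app", "consumer upload"))  # False
print(hipaa_likely_applies("employer", "consumer upload"))             # False
```

The point of the sketch is that identical data yields different regulatory answers depending on the collection context, which is why participants stressed looking at how information entered the organization.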
For companies in sectors with a history of regulation, navigating regulatory grey zones
was easier because these companies were used to documenting processes and tracking data.
However, for many other businesses outside of these sectors, the mandates represented a new and
confusing array of requirements that were difficult to translate into practice.
Policies on Paper ≠ Policies in Practice
When talking about the implementation of the CCPA, participants noted certain aspects
of laws were especially problematic to translate into practice. They stressed that laws that made
sense on paper could not necessarily be easily implemented in practice. One reason for this was
the existence of competing realities or multiple sets of shared meaning systems within
organizations that influenced how stakeholders within companies interpreted and responded to
data compliance mandates.
Competing Organizational Realities
According to participants, not every department saw eye-to-eye when it came to using
data within an organization. There are often multiple “camps” that advocate for certain positions
and actions that aligned with their particular goals and interests. For example, marketing
departments tend to push for more flexibility and wider access to data whereas IT and security
preferred to limit access as much as possible. This aligned with a concept in organizational
theory known as organizational reality (Owen and Dietz, 2012).
Organizational realities reflected the shared standpoints and perspectives of particular
groups within a company and included collective beliefs, values, priorities, and objectives. A key
aspect was shared meaning or “the degree to which the organization is unified around a shared
set of meanings about beliefs, experiences, actions, and results” (Owen & Dietz, 2012, p. 8).
According to Owen & Dietz (2012), “when an organization is able to align members, processes,
systems, and aspirations around a sense of shared meanings about what is important and
worthwhile, it enables its members to fulfill important psychological needs for purpose, and this,
in turn, creates high levels of commitment and motivation” (p. 8).
However, study participants noted that when it came to data, there was not a single set of
shared meanings within their organizations. Instead, there were multiple competing perspectives
that certain groups or departments tended to coalesce around. These organizational realities
shaped how stakeholders perceived situations by supporting and reinforcing certain
interpretations of definitions that aligned with their particular point of view. Practitioners noted
that for them, success depended on their ability to navigate this divide and persuade other
stakeholders within the organization to adopt their organizational reality and approach to data.
This finding was consistent with organizational theory scholarship which noted that “shared
meaning is not given, however, for it requires members to develop and assign a particular and
specific meaning to information, and to develop a shared grasp of its significance or implications
for their own as well as others’ behavior. At issue then is how organizations go about creating
shared meaning, especially during times of needed change, and how they ‘live’ this meaning on a
day-to-day basis” (Owen & Dietz, 2012, p. 8). Interviewees were unanimous that one factor that
significantly complicated this internal negotiation process was imprecise definitions within data
privacy laws.
Imprecise Definitions. When it came to internal implementation of mandates,
participants noted that vague or imprecise definitions within the laws themselves could
significantly complicate the translation process by making it harder to reconcile competing
organizational realities around data within companies. These findings aligned with research by
Waldman (2020) that argued the ability to interpret definitions and standards within laws leads to
ambiguity and complex challenges in practice. As Waldman (2020) noted, “whenever a new law
comes into effect, decision makers on the ground are empowered to adapt, consider changes in
context, and assess what is best under the circumstances, especially before a court of law has a
chance to have its say. But vague laws can be particularly problematic. The ‘staggeringly
complex’ and ‘ambiguous’ GDPR lays out several of these broad standards that need to be given
‘specific substance over time.’ Until that happens, regulated companies have room to determine
what the law means” (pp. 793–794). Results from this study suggested that a similar negotiation
process occurred internally within organizations as certain departments and organizational
realities clashed around how to respond to data privacy laws.
Vaguely defined terms in legislation provided little guidance for how or why a single
definition should be adopted over others. This opened the door for definitions to be molded and
adapted based on the situation or priority at hand. It is important to note that this problem is not
simply a result of lack of clarity in rules but a naturally occurring feature within any regulatory
system. Even with clear definitions, practitioners would still have to contend with multiple
interpretations of rules (albeit to a lesser extent) because there are different interests and
priorities at play. Because the European Union treats privacy as a fundamental human right, there
are fewer opportunities than in the United States to dispute legal definitions or offer alternative
interpretations that diverge from the law’s intended meanings. However, there will always be
some level of negotiation required around laws because multiple interests are coming together
and interacting with one another. As participants pointed out with deidentified and anonymized
data, vague legal definitions compounded this problem but were not solely responsible for
creating it.
Deidentified and Anonymized Data. Five interviewees brought up deidentified and
anonymous data as examples of how a lack of uniform standards and variance in definitions
could open the door for companies to decide for themselves what constituted appropriate protections.
PRIVACY_3 explained how companies could choose to interpret provisions as they saw fit in
the early days of CCPA:
[Deidentified data] is based on a definition and a definition is subject to interpretation.
So some people have interpreted that- especially deidentified in the CCPA- to say ‘as
long as I don’t have it associated with a real world identifier (which would be like an
email address, a name, or something like that) it’s deidentified.’ I don’t think that’s the
case and I think that you would have an argument from the Attorney General or even any
sort of plaintiff’s attorney that says ‘that’s not deidentified, someone can identify that
person.’ Plus you have to have the controls in place and you have to prove that you’re
actually not connecting those data points. There’s work there. And anonymized data I
think is a little bit harder because again all of this stuff is what’s capable technologically?
If that changes, does my obligation change? Or does it require me to not actively do this
stuff versus is someone capable of doing it? –PRIVACY_3
The question of what it meant for data to be deidentified or anonymized is still largely
unsettled and cannot be assumed in practice. Participants as well as academic researchers pointed
out that certain types of data, such as location data, were particularly problematic from a privacy
standpoint due to their highly personal nature, which made it almost impossible to have data that
was usable but not re-identifiable (de Montjoye, Hidalgo, Verleysen, & Blondel, 2013; Farzanehfar,
Houssiau, & de Montjoye, 2021). Three participants with technical and security backgrounds
provided insights into the logistical considerations they faced trying to verify whether third-party
companies they worked with were handling anonymized and deidentified data properly.
DATA_4, a security expert specializing in data governance and risk management, explained:
You want to pull at the string of how do you anonymize, how are you deidentifying, where
does the original data come from, are you holding it or is it coming to you already
deidentified? We start to try to lift up the rocks just to make sure because there’s some
people who really don’t know what they’re doing with deidentified data. They’ll have the
master dataset then they’ll change a few fields and think they’re good but really anybody
could figure out how to reverse engineer and get back to the original file links. –DATA_4
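The failure mode DATA_4 describes can be made concrete with a minimal sketch: dropping direct identifiers while leaving quasi-identifiers intact still allows linkage against an auxiliary dataset. All records, field names, and the `reidentify` helper below are hypothetical, constructed only to illustrate the interview's point.

```python
# Minimal illustration of the re-identification risk DATA_4 describes:
# removing direct identifiers (name, email) from a "master dataset" while
# leaving quasi-identifiers (zip code, birth year, gender) intact.
# All records here are hypothetical.

master = [
    {"name": "A. Smith", "email": "a@example.com",
     "zip": "90089", "birth_year": 1985, "gender": "F", "diagnosis": "X"},
    {"name": "B. Jones", "email": "b@example.com",
     "zip": "90089", "birth_year": 1972, "gender": "M", "diagnosis": "Y"},
]

# Naive "deidentification": strip a few fields, keep the rest.
deidentified = [
    {k: v for k, v in row.items() if k not in ("name", "email")}
    for row in master
]

# An outside party holding an auxiliary dataset (e.g. a public record)
# can join on the quasi-identifiers and recover identities.
auxiliary = [
    {"name": "A. Smith", "zip": "90089", "birth_year": 1985, "gender": "F"},
]

def reidentify(deid_rows, aux_rows):
    """Link 'deidentified' rows back to names via shared quasi-identifiers."""
    matches = []
    for d in deid_rows:
        for a in aux_rows:
            if all(d[k] == a[k] for k in ("zip", "birth_year", "gender")):
                matches.append({"name": a["name"], "diagnosis": d["diagnosis"]})
    return matches

print(reidentify(deidentified, auxiliary))  # [{'name': 'A. Smith', 'diagnosis': 'X'}]
```

This is the sense in which practitioners "lift up the rocks": the absence of a name or email in a dataset says nothing about whether the remaining fields can be reverse engineered back to an individual.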
The lack of uniform standards and clarity in definitions meant practitioners could not take
third-party assurances at face value. This meant each party was largely responsible for deciding
what provisions meant and how they were implemented. Unless there was a concerted effort to
engage the third party about specific governance practices, there could be considerable confusion
between stakeholders regarding actual data controls and protections in place. Practitioners
described taking on heavy responsibility for establishing shared meaning with third parties by
communicating expectations through contracts, legal agreements, and other processes. Over 75%
of interviewees discussed the difficulties they faced balancing multiple perspectives and
interpretations when deciphering the laws. PRIVACY_3 described the many ‘hats’ practitioners
must wear as they considered how to integrate rules in practice. Speaking about deidentified data,
he highlighted how definitions can change and misalignment can occur between theory and
practice.
If you’re looking at it from a privacy rights perspective like if you’re drafting a law, the
definition is clear to you. Looking at it from a potential liability perspective, you’re never
clear until there’s some sort of ruling on this. Is this sufficient? Because ultimately it’s a
battle of what someone says it is and then a judge decides what that meaning is, ‘is this
sufficient in this case or not?’ Sometimes it is a fact-specific scenario. It could be that
there’s a minimum baseline of what’s deidentified or not but we haven’t established that
within the context of the CCPA so the fact that there’s not any rulings on this, there’s a
lot of uncertainty for businesses. Is this going to be sufficient or not? Can I rely on that to
say we’re good or not? That’s why there’s uncertainty because we don’t have a legal
confirmation of what deidentified means other than to interpret the law. –PRIVACY_3
While California has since clarified some of these provisions, there will always be some
level of uncertainty because definitions are subject to interpretation. As the quotation above
highlights, it is not uncommon for certain terms and approaches to be prioritized over others,
eventually evolving and diverging from their original intended meaning to something very
different in practice.
When asked what steps it would take to ensure that data could not be re-identified,
participants explained that the process was much more subjective and less quantifiable in
practice. In situations with very few standards or guidelines, the lengths to which companies go
to protect user data often came down to internal motivations. Within organizations, this decision
was a result of negotiations between multiple stakeholder groups and departments about how to
interpret definitions in a way that reflected and aligned with the organization’s dominant reality
and objectives. PRIVACY_2, a chief privacy officer at a major research institution, explained
how anonymized data could be interpreted differently within the same organization depending on
who was asked.
I don’t think data can be anonymized and still be useful. Certainly you can go completely
abstract and have no nuance but that’s not really helpful. I think that the way to think
about that, or at least the way that I think about it, is to think about the viewpoint of the
person speaking. You know when they say ‘where you stand depends on where you sit,’
have you heard that expression before? So looking at what is this person’s goal, what is
this person’s objective, you will quickly notice that different groups have different types
of risk in mind. So if I’m general counsel for somebody or if I’m an internal auditor, my
job is to mitigate risks or handle risks to an entity and make sure that it continues to
move forward, do its business, meet its mission, whatever it is. Then you have other
people whose risk that they’re trying to handle is to individuals. I see my job as yes,
protecting the institution but it’s much more important to me anyway as a privacy
advocate to think about the risk to individuals. –PRIVACY_2
Different organizational realities often vie to be the dominant perspective around which
an organization orients policy and practices. As illustrated in the quotation above, stakeholders
had an incentive to try to push the boundaries of definitions to see if they could sway laws to work
in their favor. Some of these different organizational realities were seen within interviews when
participants talked about how they interpreted and responded to risk.
Onus of Responsibility. How privacy laws were interpreted within organizations had a
significant impact on how internal approaches to privacy could develop and take hold. Decisions
about how to proceed depended on whom the programs were designed to protect. A key struggle
for practitioners was figuring out where the onus of responsibility lay under the updated
compliance-based approach. Traditionally, the United States has placed individual users at the
center of legal infrastructures, meaning that the individual takes on the majority of responsibility
for protecting their own data privacy. However, the GDPR and CCPA attempted to shift some
responsibility for protecting individual user privacy to companies. How organizations interpreted
this shift in onus of responsibility influenced how they developed and implemented their internal
privacy approach for achieving compliance. Interviewees were split when it came to who should
bear the burden of responsibility, with many, but not all participants aligning with the dominant
realities of their profession. For those primarily worried about managing risk to an organization,
the main goal of compliance was to avoid lawsuits or anything that could negatively impact the
company. Half of the interviewees believed the primary goal was to increase transparency so
individuals could provide informed consent. PRIVACY_11, a partner attorney specializing in
strategic privacy and cyber-preparedness compliance counseling at an international law firm,
described the privacy mentality of top leadership:
I think the most important thing companies can do is be transparent about what they’re
doing. I think then you have the accurate playing field of information so that knowing
choices by the consumers or individuals can be made. –PRIVACY_11
PRIVACY_3 provided additional insight.
What it really comes down to is just don’t volunteer personal information. That’s
a big thing. A lot of people, especially young people, don’t really understand it because
they think it’s all figured out and it really isn’t. So if they’re being asked for personal
information, they must think they’re required to give that up. –PRIVACY_3
When the priority was promoting individual responsibility, solutions tended to focus on
user education, transparency, and dispelling myths or misconceptions about data collection and
usage. On the other hand, when risk to the individual was considered equally important as risk
to the organization, the decision calculus changed significantly. The
other half of the interviewees pushed back against this assertion, arguing that simply providing
transparency and more information to users would not help people make more informed
decisions. They argued that the landscape was already so complex, and only getting more
complicated, that it was unreasonable to expect consumers to navigate it without major
changes. PRIVACY_2 and PRIVACY_8 shared their thoughts on the matter:
You’re asking an individual to figure out what the heck Google is doing across 20
different products that they provide. I don’t know and I’m in the business. It’s just like the
Mark Zuckerberg hearings where the senators said, ‘we read your privacy policy, what
the heck does that even say? That doesn’t tell me anything.’ It’s not fair to ask people to
do that on their own and make good informed decisions across platforms. There was
somebody that did research similar to that where they looked at privacy policies and they
said, ‘if I had to read the privacy policy of every place that I went it would take like 76
years or something like that.’ –PRIVACY_2
Unless you’re talking to a specialized privacy attorney like myself, you’re just not
going to understand those nuances and I wouldn’t expect you to because it’s very
complicated. I have the benefit of [my company] having a Sacramento office and people
are on the floor in the state legislature reporting developments to me. Plus I’m able to
aggregate all of the knowledge I get from all of our clients so if you don’t have access to
that, you’re just not going to understand these nuances and you’re only going to do
things when you absolutely have to do them. –PRIVACY_8
For these interviewees, the goal when building privacy programs was to help consumers
achieve not just informed consent but informed engagement with companies holding their data.
In this case, solutions emphasized user education, choice, and shared responsibility within the
company-consumer relationship. Even within the privacy practitioner community there is
divergence in opinion about where the onus of responsibility lies. This divergence is not simply
the result of imprecise language in legislation but of competing organizational realities.
Regulatory Environments, Organizational Realities, and Privacy Implementation
As previously mentioned, imprecise definitions and ambiguity alone are not responsible
for the situation. However, the uncertain regulatory environment does make it significantly more
difficult for practitioners to carry out privacy work by creating conditions that make it harder to
overcome divergent realities within organizations.
Regulatory Limbo
Over half of the participants noted that unpredictable conditions made it difficult to
advance privacy within an organization. The uncertainty about what laws would get passed and
when mandates would go into effect made it difficult to develop, plan, and build out programs
within organizations since the “goalposts” could be moved at any time. Feasibility was also an
issue because doing privacy right required enormous investment including time, money, and
ongoing planning, coordination, and collaboration within the company and with external partners.
Leadership was understandably skeptical about devoting large amounts of resources to build
privacy into everyday operations when mandates were still up in the air. Additionally, efforts to
persuade management within the company to adopt more privacy protective realities were
hampered by the fact that organizations could engage in practices that were deemed compliant
without actually protecting data. PRIVACY_8 explained:
One aspect is you can have the proper policies, disclosures, and whatnot ironed out and
in a lot of ways that’s the easy thing to do. It’s much harder to actually ensure that
you’re looking in every corner of your company and grabbing all the data that you need
to. So part of it is that ‘yes, there’s confidence that we’re doing everything we have to do
that will protect us from a civil penalty.’ However, they may be less confident that they’re
actually fully complying with the law- if we’re all being honest. –PRIVACY_8
Part of the challenge when communicating internally about data privacy was its abstract
nature. Unlike security, threats to privacy were still not fully understood within most
organizational realities. This left many practitioners at small and medium-sized enterprises
(SMEs) in tight spots trying to justify
spending that would most likely change the status quo of the organization without offering many
tangible benefits in return. Rather than creating robust, long-term data governance strategies as
originally envisioned by the comprehensive data mandates, interviewees described many
organizations gravitating towards short-term, minimally intrusive, low-commitment ‘fixes’ that
allowed their organizations to demonstrate compliance without disrupting normal operations or
costing too much money.
Planning in an Uncertain Regulatory Environment
Practitioners’ understanding of their responsibility within the privacy chain combined
with their assessment of the regulatory environment influenced the type of solutions they
advocated for within their organization. Participants described two primary ways companies
responded to the uneven, patchwork system of privacy laws in the United States. The first was
proactive adoption of comprehensive privacy principles. This approach involved orienting
policies and practices around the strongest law, industry standard, or high-level goal the
organization set for itself. Companies benefited by staying ahead of policy developments and not
having to worry about new legislation or falling outside of compliance rules. PRIVACY_6 gave
an example of how clients have dealt with this decision.
I had one client that was on the fence about opening up their own competing business
unit but it had been easier for years to outsource it to a vendor and they had two vendors
in that sector that could not get GDPR compliant. One of them was being difficult and the
other was just too small but it didn’t matter because the data was sensitive enough it
could have been a problem. That was the deciding factor for them they were like, ‘we
can’t outsource this anymore, to hell with it, we’ll build it up. Now we can justify the
cost.’ –PRIVACY_6
This approach was particularly important for organizations that did (or were looking to
do) business outside the United States in order to comply with global data sharing and transfer
standards. Participants noted that while some companies have opted for this approach, there were
a number of barriers that made this path far less feasible for many SMEs. This approach was
often perceived by leadership as unnecessarily expensive and potentially not worth the cost of
disrupting business operations. Additionally, such changes could hinder the company’s ability to
compete for clients if the company was not in a position to leverage privacy improvements.
On the other hand, other companies found this unstable and patchwork regulatory
environment discouraging and opted instead to do the bare minimum required to demonstrate
compliance. Practitioners explained that certain organizations (often those with less advanced
data privacy and information governance capabilities) were more tepid and risk averse in terms
of their organizational responses, preferring to “wait and see” how things turned out before
making any moves. In general, these companies were focused more domestically so international
standards around data only indirectly affected them. Leadership at these organizations tended to
focus more on the short-term risks involved with updating and overhauling their business
processes rather than the long-term risks of not being compliant with regulatory standards down
the road. PRIVACY_6 described working with clients who preferred the piecemeal route:
So most companies right now have a privacy notice and a lot of them have taken the
stance of carving out and having a separate CCPA notice. So for whatever reason they
want two, but that’s more to update. I’m like ‘okay fine you can have two’ but what
happens when you have five, six, seven more states? My point is always, are you going to
have like five more links to your privacy notice, your California, your New Jersey, New
York, Texas, Nevada? I don’t know how much real estate you’ve got on your webpage.
But maybe by the time there’s enough practice in integrating all the different
requirements by necessity since this thing would become very unwieldy- maybe that
would also sort of naturally help it. It will go the other end, it won’t be compliant
because it will be so murky and maybe it will help come up with easier, more
straightforward approaches. –PRIVACY_6
Interviewees stressed that while it was common for companies to seek advice and look to
peer organizations for guidance on how to proceed, at the end of the day each organization had to
make its own decision regarding the costs and benefits of investing in privacy compliance. This
decision was dependent upon multiple factors such as leadership priorities, business pressures,
regulatory and public scrutiny, and industry standards.
Adapting Laws to Fit Practice
One issue discussed by participants was how organizations could cut corners while
demonstrating compliance. This was especially common when mandates were seen as just
another hoop to jump through rather than an integral aspect of the organization to be prioritized.
Speaking about steps companies should be taking to protect data moving forward, PRIVACY_6
shared her thoughts:
I think that business has always known that these kinds of things would be the proper way
to go about doing things, life, conducting internal business. But there is always a corner
to be cut because a to-do list that has a couple items too many and what you end up
finding, unsurprisingly but all too common, is that tasks and activities that cost the
business money –so if you’re not building the product but assessing it– you’re that much
farther from turning it into a revenue generator. Those kinds of cost center type activities
tend to get cut or they tend to get skimped on. –PRIVACY_6
According to interviewees, some of the slow progress could be attributed to laws being
adapted to fit practice rather than the other way around. Part of this was due to leadership
mentalities that did not treat privacy as a priority and instead stuck with approaches that
continued to treat privacy as a distant set of actions that only applied in certain situations. Failing
to prioritize data privacy often led to subpar, haphazard practices and disorganized policies
that could rarely be enforced. Aside from a few specific instances like approving budgets or
responding to major crises in the news, privacy did not factor into everyday business decision-
making. Trying to push new organizational realities that valued individual privacy onto apathetic
leadership and the organization could risk alienating privacy practitioners from the rest of the
company. Interviewees mentioned leadership was supportive when problems involved easy fixes
or if the issues were egregiously bad. However, outside of this, there was not much interest in
making privacy a core part of the business since leaders saw little value in going above and
beyond the minimum standards without more incentives. PRIVACY_10 summarized this
mentality:
We’re dealing with systems that are 20+ years old and who’s got the budget to fix them?
If they don’t do it at the time, which requires a level of discipline and governance to build
what’s called privacy-by-design. So privacy-by-design is less expensive because rather
than reengineering things later or even facing the prospect that legal will shut you down
completely before you can even launch, you build it in. Incrementally if you do this with
every system, eventually the organization will be great right? Why aren’t we there yet?
Because we haven’t done it. We haven’t introduced that discipline because everyone’s
going ‘yea that’s an additional cost.’ Well yea but it’s a lot less expensive than the
breach or the fine or trying to retrofit privacy later. –PRIVACY_10
While comprehensive approaches built with privacy-by-design principles baked in might
work for companies like Apple, they are a hard sell for SMEs with fewer resources and different
business models. Many companies already have established systems and practices in place and
might not be able or willing to overhaul their business models and infrastructures without a
tangible incentive or threat of reprisal. Further, embracing privacy as a part of a business’ brand
identity is not only difficult to achieve in practice but can backfire if companies do not follow
through on their promises, leading to even more mistrust and scrutiny (Brough, Norton, & John,
2020). Participants noted that even though organizations are increasingly required to demonstrate
compliance with data mandates, these vital accountability measures and processes can be
undermined as companies go through the required motions but often cut corners, resulting in
significantly reduced effectiveness.
Undermining Vital Accountability Measures
Two examples of processes that can be undermined when demonstrating compliance
were data inventories and data retention.
Shortcutting Data Inventories. Participants described data inventories as an essential
aspect of compliance and information governance. Inventories involved tracking data and
understanding how it moved through an organization from the second it entered the virtual four
walls of the company to the minute it left its possession, including whom it went to, how, and
why. Mapping the flow and lifecycle of data was key to understanding risk and informing
strategy by identifying weak points and prioritizing how to protect data. This also gave
practitioners a better idea of the applicable laws and relevant responsibilities the organization
had to comply with such as encryption or other special precautions and considerations. Every
participant noted that despite the importance of keeping track of data, many companies often had
very little knowledge about the data feeds going into and out of their systems. Under new
compliance rules companies could be held liable for infractions committed by other companies if
data was improperly handled or shared. Three participants spoke in depth about conducting data
inventories, outlining the two most common techniques: surveys and automated
solutions.
Survey Methodology. According to interviewees, surveys were the most common
strategy for data mapping. This method involved distributing standardized surveys (often
including checklists) internally to identify stakeholders that came in contact with data in every
business unit of the company. This process was extremely time-consuming and relied heavily on
department managers to provide accurate and up-to-date information about the data their unit
was collecting and how it was used. PRIVACY_8 explained how quickly this process could
become extremely complex.
The survey can be quite long because if you’re doing it right, what you want to be able to
do is correlate every category of data with every potential source, every potential use,
and every potential recipient. If you think about just those four data points and the
different permutations that you have to track, ideally you’re tracking one data type. It’s
from this source and here’s the five different uses. Of those five different uses, here’s the
ten different recipients and so it becomes incredibly complex pretty quickly even with
fairly small companies. So using a third-party proprietary system makes that a lot easier
especially on the back end where they can do reporting then of course you have to deal
with updating it on an annual basis…ideally. So it’s really hard. –PRIVACY_8
Automated Solutions. The second method was automated solutions. This approach relied
on artificial intelligence (AI) or scanning technology to look at data within the network by
running analytics, identifying, flagging, and pulling data to build maps of the system. Automated
solutions were seen as useful long-term investments for complying with data access requests.
However, the large upfront costs associated with implementing these systems meant leadership
was not always keen to take on this investment. PRIVACY_3 explained how these automated
solutions worked for data access requests in practice.
There’s a lot of nuance and different things. We have a company we work with that will
build APIs [application programming interfaces] from all your third-party SaaS
[software as a service] providers so Salesforce and whatever. Then they’ll identify the
data that they get, give you, or store for you and then when a request is made for data,
they’ll identify that individual, pull the data from the systems into almost like a sandbox,
and say ‘here’s what we have on this person, evaluate it and respond whichever way you
want’ so now that we’ve gathered it for you, you can respond.
So the challenge there is that it takes a little while to do that. You got to build
APIs for each of these systems and then you got a certain time frame to do the looking
and all this stuff- hopefully you’ve done it beforehand saying ‘can we delete that? Can we
not?’ but it takes a lot of the work out and its scalable for sure. If you get 100 requests
for information and you've got to go back to each system to look for it, that’s like a full
time job for somebody. –PRIVACY_3
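The workflow PRIVACY_3 describes can be sketched at a high level as follows. All names here are placeholders; the connector functions stand in for purpose-built API integrations with each third-party system and do not represent any real vendor's product.

```python
# Hypothetical per-system connectors; in practice each would be a
# purpose-built API integration with a SaaS provider (Salesforce, etc.).
def fetch_from_crm(person_id):
    return {"system": "crm", "records": [{"id": person_id, "email": "a@example.com"}]}

def fetch_from_marketing(person_id):
    return {"system": "marketing", "records": []}

CONNECTORS = [fetch_from_crm, fetch_from_marketing]

def gather_access_request(person_id):
    """Pull the individual's data from every registered system into a
    'sandbox' the company can evaluate before deciding how to respond."""
    sandbox = [connector(person_id) for connector in CONNECTORS]
    return {"person_id": person_id, "results": sandbox}

request = gather_access_request("user-123")
```

In practice, the hard parts are exactly the ones the participant names: building and maintaining a connector for every system, and deciding beforehand what may or may not be deleted.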
In a perfect world, surveys and automated solutions would be used in tandem to account
for limitations with each method. Surveys were almost always based on self-report, meaning
they could only be as informative as the person filling them out. If department managers did not
know or were not honest about the extent of data being collected, there would be no way to
discover or account for these other sources of data. On the other hand, automated solutions were
seen as slightly less reliable than surveys since they did not involve as much input from
employees, making it difficult to pick up on “workarounds” or actions that fell outside official,
documented channels or established practices. Another benefit of surveys was that they could be
adapted to organizations of any size, regardless of setup. When given a choice, participants
preferred surveys to automated solutions. PRIVACY_8 provided insights into the logic behind
this decision.
Ideally you’d probably want to employ both but I’ve never run into a company that does
apply both because that would be incredibly costly. The human-based survey is prone to
error because it’s humans completing it. The scan is prone to error because automated
technology is not going to pick up on everything and its not going to be able to really tell
you the story of different use cases of the data, which is incredibly important. –
PRIVACY_8
Regardless of method, data inventories were dependent on people being both
knowledgeable and honest about data practices within the organization. This required
establishing trust with employees and communicating that the goal was not to punish or take
away tools that made employees’ lives easier but to help ensure they were not breaking any laws or
creating undue risk for themselves or the organization. PRIVACY_6 described the importance of
conveying the purpose of data inventories to encourage honesty and trust.
The inventory, if done as a ‘safe’ engagement, can ferret out [the use of unverified third-
party apps and services aka shadow IT and shadow vendors]. If people honestly record
what is being done, how, with what systems/vendors, etc. there is a chance the company
can track, protect, and be compliant with the data. It is not necessarily malicious use by
the employees, but workarounds if the current systems or vendor-selection is slow or not
fitted to current need.
If you get it documented and make it clear this is not punitive, this is not a
disciplinary action, it is going to be hell on earth to get this stuff documented –
acknowledge that up front – but let folks know that what we need is honesty in this. It’s
okay. So you have a vendor that didn’t go through procurement– fine. Put the
information in there so we can go back to it. If you can actually get the true story, if you
can start to hack into shadow IT and shadow vendors and then also see the legitimate
vendors in these processes, it’s kind of spelled out black and white and it makes it easier
to find those very first vendors to go to. –PRIVACY_6
Feasibility of Inventories. Although organizations are increasingly compelled by law to
conduct inventories, policymakers must remember that compliance can come in many different
forms. Simply adding requirements without consideration of the larger context risked
compounding problems by turning inventories into a series of elaborate hoops that companies
must jump through to demonstrate compliance. According to participants, the timeline for
conducting data inventories at SMEs could range anywhere from a few days to over six months.
A fairly comprehensive survey could theoretically be done in about a month but usually took
anywhere between one to six months. This variance was largely due to how complex the
situation was, how serious the organization was about conducting comprehensive inventories,
and how much leadership was willing to spend. For many SMEs, regulation was a completely
new ballgame, and companies often struggled to wrap their heads around data
governance and understand the full scope of data in their possession. Despite the importance of
data inventories, participants noted that normal, everyday time and resource constraints put
pressure on them to speed up the process even if it resulted in lower quality results. PRIVACY_8
described some of the “imperfect solutions” used to make this process go faster.
Often times the shortcutting way of doing it, I have charts and questionnaires that I’ll
send to companies and they’ll just try to figure it out. If we’re trying to work really fast,
we’ll just have a long call with certain stakeholders on the phone and I’ll just ask them a
ton of questions. Sometimes it will be like ‘drop the privacy policy’ and it’ll have a
million margin notes in it that’ll be like ‘hey, confirm that you collect this’ or ‘confirm
that you do this.’ I’ll browse around their website and try to figure out what exactly
they’re doing with data. Those are very imperfect solutions but sometimes that's just the
best the company can do. –PRIVACY_8
Endless Data Retention. Data retention policies were another aspect of the information
governance process that often got diminished in practice. These policies outline how an
organization distinguishes between data that is necessary to keep and unnecessary data that
should be deleted. The goal of these policies was to move
away from the industry norm of having troves of unused data sitting on forgotten servers for
“perpetuity.” These guidelines were specific to each organization and designed to help reduce the
risk of data being lost or compromised by prioritizing the data kept in its possession. Part of the
challenge of enforcing data retention policies was that prior to the introduction of the GDPR in
2018, there was no precedent for deleting unused data. Practitioners described the struggle of
introducing a new organizational reality that disposed of unneeded data when it was no longer
relevant to the organization. PRIVACY_6 explained:
[Data retention] is an incredibly weak link in the chain. Lots of clients that I work with
don’t even have retention policies which means by default it just gets kept in perpetuity.
Again that’s where if I had to say businesses could only do one thing ever, it’s that data
inventory because that final column is how long is this data retained for? If you start
having an entire column where the only answer is ‘I don’t know’ then it’s a lot easier to
advocate to leadership that you need some funds to get it done. –PRIVACY_6
Because companies still struggled with conducting data inventories, it was no surprise
that implementing data retention policies was another major challenge for practitioners. Many
companies had not even reached the point where they were actively tackling issues of data
retention and deletion. Interviewees described the resistance they encountered when trying to
purge databases of unnecessary data. According to PRIVACY_10,
[Companies] are terrible. They don’t often have systems or processes in place to actually
delete data. We keep everything, we’re information packrats today. The principle is you
should get rid of data when you no longer have a business or legal reason to retain it.
But we don’t get rid of data, we just assemble it and it magnifies the issues when there’s
a problem because we’ve over-retained. –PRIVACY_10
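The retention principle PRIVACY_10 articulates reduces to a simple, mechanical rule that organizations nonetheless rarely operationalize. A minimal sketch follows, with invented retention periods and record fields for illustration only:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: how long each record category may be kept.
# Real schedules would reflect the organization's business and legal reasons.
RETENTION_PERIODS = {
    "marketing_contact": timedelta(days=365),
    "invoice": timedelta(days=7 * 365),  # e.g. retained longer for legal reasons
}

def flag_for_deletion(records, today):
    """Return the records whose retention period has lapsed."""
    expired = []
    for record in records:
        limit = RETENTION_PERIODS[record["category"]]
        if today - record["collected_on"] > limit:
            expired.append(record)
    return expired

records = [
    {"id": 1, "category": "marketing_contact", "collected_on": date(2020, 1, 1)},
    {"id": 2, "category": "invoice", "collected_on": date(2020, 1, 1)},
]
expired = flag_for_deletion(records, date(2022, 1, 1))
```

The rule itself is trivial; what participants described as missing were the inputs it depends on, namely a complete inventory (the "final column" of how long each data type is retained) and agreed retention periods for each category.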
Since companies needed to know what data they had in order to get rid of it, enforcing
data retention policy was heavily dependent on organizations conducting comprehensive data
inventories. This process of figuring out how long to keep data and what constituted “relevancy”
was a key point of contention that required negotiation across departments and stakeholder
groups. Interviewees likened their role in this process to ‘mediators’ between different units of
the organization. Rather than trying to strong-arm departments into following their lead, most
preferred a more collaborative approach that involved bringing multiple departments together in
an effort to create a shared understanding about data. One way to do this was to frame the
narrative around risk. Risk was one of the few concepts powerful enough to transcend the divide
between business units and their competing organizational realities. Although definitions and
tolerance for risk varied by department, every stakeholder understood the importance of
managing and mitigating risk to the organization. PRIVACY_6 described how framing the
conversation around risk helped unite multiple stakeholders around a single shared
organizational reality.
Again it’s like an education to the business that legal and IT are not out to steal your
data and delete all your contacts and make your life miserable. It’s like my mantra, ‘if
you do not have the data you are not at risk because of it’ so get rid of what you can and
protect the stuff that you’ve got. But retention is absolutely the weakest link in even well-
developed, robust, internal legal departments don’t have strong- some cases have no
retention policies- but certainly aren’t enforcing it. Again it goes down to the bottom of
the list and it is one that should be up a bit higher. –PRIVACY_6
Participants realized that their success relied heavily on their ability to translate sector-
specific terms into a broader language understood equally across all departments. Risk and risk
management was a language spoken fluently throughout every level of the organization. While
the C-suite, marketing, or sales might not necessarily understand the language of ‘privacy,’ ‘data
protection,’ and ‘compliance,’ they did understand when this was translated into the language of
risk, liability, and lawsuits. Interviewees pointed out that risk framing alone was often not
enough to motivate immediate action or changes. However, they did note this process began to
lay a foundation for multiple stakeholders to come together and build trust in one another as they
worked to address these issues collectively. Establishing a new, shared reality would take time
and lots of negotiation but hopefully give rise to a new organizational culture that prioritized
shared responsibility, good data governance, and the protection of consumers in addition to
company interests.
Organizations Are Never Fully Compliant
While it would be easy to point fingers and lament the ways data inventories and
retention policies were being conducted, the bottom line was that these were not major priorities
for many SMEs. Even with practitioners attempting to carve out new perspectives and realities
around data, dominant organizational realities still viewed inventories largely in terms of cost,
which ate up valuable time and resources while failing to generate any substantial value or utility
for the organization or individual departments. Deleting data meant closing the door on
potentially lucrative sources of information further down the road. Participants noted companies
were also discouraged from adopting new organizational realities because full compliance was
largely unattainable under the new system. PRIVACY_3 explained:
[Businesses] are never fully compliant. I don’t think you can be, that’s the thing. You’re
constantly gathering data and so it’s like ‘are you doing it right every time?’ is the
question you have to answer. To get the basics in place like your policies and all that
stuff- it just depends on how complicated your data collection practices are, how quickly
are you motivated to get this done? –PRIVACY_3
The amount of data being generated and collected every day combined with changing
rules and expectations around governance could be overwhelming and make practitioners’ jobs
of convincing leadership to adjust their approach an uphill battle. Participants were quick to
point out that many companies had established policies on paper but did not necessarily have
the ability to carry them out. However, the current landscape did not explicitly require companies
to show proof of concept, so concerns about actual implementation could be addressed at a future
date. This was exacerbated by mentalities that compartmentalized data responsibility into a
single department like privacy or legal rather than treating it as a responsibility shared by all
departments.
Chapter 3: The Evolution of the Compliance Landscape (Study 2)
The data compliance landscape in the United States has undergone a monumental shift
within the last five years. The introduction of comprehensive data protection legislation like the
General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA)
marked the beginning of a new era, characterized by laws designed to address challenges brought
on by an increasingly digital world. Part of the challenge involved acknowledging the
shortcomings of current systems, particularly the fact that many of the assumptions and
governance structures that comprised the data ecosystem were no longer compatible with the
current regulatory environment in the United States. Law and policy literatures have already
demonstrated that comprehensive data privacy and protection regulations like the GDPR and
CCPA represent a shift in how privacy is defined, understood, and codified into law (Alpert,
2020; Chander, Kaminski, & McGeveran, 2021; Hartzog & Richards, 2020; Nicola & Pollicino,
2020; Yeh, 2018). However, in recent years scholars and experts have expressed disappointment
over the lack of progress on privacy and data protection. Much of the criticism stems from
difficulty exercising rights (King & Stephan, 2021; Turner et al., 2021; Waddell, 2021a), lack of
transparency (Alpert, 2020; Mandalari, Dubois, Kolcun, Paracha, Haddadi, & Choffnes, 2021;
Ranking Digital Rights, 2020), continued violations of privacy (Gundersen, 2020), and a
shortage of strategies for holding companies accountable (Germain, 2021; Lancieri, 2022; Parks,
2022).
One problem is that there is not a good understanding of how these laws are operationalized
and put into practice within organizations (also known as ‘privacy on the ground’). The majority
of scrutiny has focused on the legislative or regulatory processes including interpreting laws,
investigating violations, or analyzing regulatory actions (Baik, 2020; Gundersen, 2020; King &
Stephan, 2021; Linden, Khandelwal, Harkous, & Fawaz, 2020; Wachter, 2018). However,
industry perspectives about how companies are adapting business practices in response to new
mandates have been notably understudied within academic literature (Bamberger & Mulligan,
2015; Waldman, 2018, 2020).
The goal of this chapter was to begin to rectify this gap. By interviewing industry
practitioners tasked with translating these laws into practice, this study explored the evolution of
the data privacy compliance landscape in the United States and provided insights into why so
little progress had been made in terms of improving consumer data privacy and protection. In
contrast to previous scholarship that had studied the topic from a privacy law perspective
(Bamberger & Mulligan, 2015; Waldman, 2018, 2020), this research investigated the challenges
experienced by privacy practitioners from a communication perspective. In particular, it
identified institutional factors that impeded and prevented data privacy laws from achieving their
desired effect. This approach to understanding the space utilized concepts from institutional
theory (Battilana & D’Aunno, 2009; Battilana, Leca, & Boxenbaum, 2009; DiMaggio &
Powell, 1983) that provided a lens and vocabulary to view, identify, and understand maneuvers
taking place within the privacy landscape. Putting organizational perspectives and concepts into
conversation with privacy law scholarship was essential for understanding the intersection
between privacy and practice, specifically organizational change and the processes that were
being undertaken by companies as they adapted to new data privacy mandates. Together, this
approach highlighted gaps and opportunities in existing literature to further understandings of
privacy law and policy in practice.
This work showed why comprehensive data protection laws have had limited
effectiveness due to the regulatory landscape forcing practitioners to integrate two competing
approaches to privacy with little to no support or guidance. Tensions arose when laws that
adopted a comprehensive approach to data protection were overlaid on top of sectoral systems of
practice. These tensions will continue to undermine consumer privacy protection initiatives
unless they are acknowledged and addressed by industry and government stakeholders.
This chapter sets the stage by providing background on organizational change and why it
can be difficult for companies to adapt to new laws and mandates. This provides a backdrop for
understanding the significance of the organizational changes taking place within the privacy
landscape. Borrowing concepts from institutional theory provided a toolkit to help make sense of
the existing debates and incompatibilities in the ‘privacy on the ground’ literature.
Conceptualizing privacy practitioners as institutional entrepreneurs provided a framework to
begin reconciling competing privacy narratives but more importantly offered a theoretical
vocabulary to conceptualize, discuss, and interpret the situation through the lens of the paradox
of embedded agency. The results section discusses 1) the evolving definition of compliance
before and after comprehensive data protection laws (i.e. the GDPR and CCPA) were put in
place from a theoretical and practice lens, 2) the lack of infrastructure for practitioners to ensure
companies follow through on their own privacy policies, and 3) how practitioners overcome
these challenges.
Review of Literature
Approaches to Data Protection
Approaches to data protection represent guiding principles for how data protection and
privacy get conceptualized, codified, and put into practice. Each approach represents
fundamental values, assumptions, and end goals that influence how data protection gets defined,
translated, and implemented on the ground. According to Swire and Kennedy-Mayo (2020), data
protection models vary based on how they utilize law, markets, technology, and self-regulation
as sources for privacy protection. The scope, enforcement, and adjudication of laws vary
“depending on how much the specific country relies on government laws versus industry codes
and standards” (p. 57). The two approaches of interest for this study were the comprehensive and
sectoral models. Although in theory the two models represent two points on a continuum, there is
no clear separation between them in practice; most systems are not all or nothing but
almost always incorporate some aspects of both models.
Comprehensive Approaches to Data Protection
Comprehensive data protection models involve creating general laws that govern “the
collection, use and dissemination of personal information by both the public and private sectors”
(Banisar & Davies, 1999, p. 13). A key feature of these models is that they establish a
common set of standards that guarantee certain rights to individuals regardless of context. These
baseline rules apply across the board irrespective of whether an organization is a government or
business entity. Another feature is that there is often an official agency or entity responsible for
enforcement that oversees compliance, investigates breaches of provisions, and acts as a liaison
to the wider community. This often takes the form of providing education and advocating on
behalf of the public as well as interfacing with the international community on data protection
issues (Moshell, 2004; Swire & Kennedy-Mayo, 2020).
The comprehensive model is most commonly associated with the European Union and its
approach to data protection. The EU perspective is grounded in the belief that privacy is a
fundamental human right that is explicitly outlined, protected, and prioritized by law and
governance structures (Hartzog & Richards, 2020; Pernot-Leplay, 2020). The General Data
Protection Regulation (GDPR) represents the most extensive codification of comprehensive data
protection legislation around the world to date (see Hoofnagle, van Der Sloot, and Borgesius
(2019) for a detailed discussion of the GDPR and its significance to privacy and data protection
around the world). According to Hoofnagle et al. (2019) “the GDPR implements constitutional
commitments, ones that are deep and occupy a central place in the self-conception of a new,
information age political body.” They argue that this results in the GDPR being “the most
consequential regulatory development in information policy in a generation [by bringing]
personal data into a complex and protective regulatory regime” (p. 66).
As Swire & Kennedy-Mayo (2020) point out, enforcement and funding are critical to the
success of comprehensive models to ensure laws are applied and upheld throughout the economy
and society. Critics argue this ‘one-size-fits-all’ approach does not adequately address risk while
burdening organizations with unnecessary requirements, controls, documentation, and audits that
not only raise costs but also stifle opportunities for innovation.
Sectoral Approaches to Data Protection
On the other hand, the sectoral model regulates personal information through laws that
address a specific sector or industry (Woodard, 2004). Unlike the comprehensive approach, rules
involving the protection and treatment of personal data can vary widely depending on the
specific context in which it was collected. This means there are different standards that can apply
to data depending on the industry and whether an organization is considered to be a public or
private entity (Acosta, 2018; Pike, 2020; Woodard, 2004). Certain fields such as finance and
healthcare have additional protections to account for the sensitive nature of data (Mulligan,
Freeman, & Linebaugh, 2019). Because rules vary by sector and privacy is distributed across
multiple contexts, there is no single entity or agency that is responsible for overseeing or
enforcing privacy (Banisar & Davies, 1999).
The sectoral model is most commonly associated with the United States. What
distinguishes this approach is the idea that data’s value is “highly context-specific and not
sensitive in itself” which allows protections to vary based on the situation (Pernot-Leplay, 2020,
p. 103; Roos, 2006). Rather than treating privacy as a fundamental human right that requires
standardized rules across all sectors, the US does not explicitly protect privacy as a
constitutional right. Instead, it grounds privacy in a consumer protection framework
(Brill, 2011; Ohlhausen & Okuliar, 2015; Waller, Brady, Acosta, Fair, Morse, & Binger, 2011).
The major benefit of this approach is the ability to recognize and account for different levels of
risk based on context, which leads to cost savings, less regulatory burden, and more
opportunities for innovation (Richards, Serwin, & Blake, 2022; Swire & Kennedy-Mayo, 2020).
However, according to Pernot-Leplay (2020) this lack of explicit provisions can pose problems
when new privacy mandates conflict with well-established constitutional rights, leading to less
robust or “watered-down” protections. He explains, “the main consequence is that the protection
of personal data is always balanced against other interests such as commerce or free speech” (p.
115). The lack of a central authority and the distributed nature of privacy also make the sectoral
model difficult to implement and enforce. Critics have increasingly pointed to its weaknesses in keeping
up with technology, leaving major gaps in privacy coverage. Compared to the comprehensive
approach where rules automatically apply to new technology, the sectoral model requires
legislation or a rule-making authority to specifically articulate provisions for that technology in
order for protections to take effect. As a result, when “the boundaries between industries change
over time, previously separate industries can converge, potentially leading to different legal
treatment of functionally similar activities” (Swire & Kennedy-Mayo, 2020, p. 62).
The US federal government’s slow approach to addressing privacy has resulted in a
fragmented environment characterized by sector clashes and a patchwork of state laws
(Goodyear, 2020). In fact, Hartzog & Richards (2020) describe the American privacy landscape
as a “complicated hodgepodge of constitutional law, piecemeal federal statutes, state laws,
evidentiary privileges, contract and tort law, and industry guidelines” (p. 1697). In the absence of a
comprehensive federal privacy law, California has taken the lead with its own
comprehensive privacy laws known as the California Consumer Privacy Act (CCPA) and more
recently the California Privacy Rights Act (CPRA). Modeled after the GDPR, California’s
comprehensive laws have become the de facto standard for privacy in the United States (Chander
et al., 2021).
Structures, Agency, and the Difficulty of Organizational Change
Organizational change is one of the most well-researched topics in organization and
management literatures. Organizations are resistant to change not only because of physical
constraints like existing structures, resource allocation, and company composition, but also
because of intangible elements such as culture, norms, values, and patterns of practice. Of the
applicable theories, institutional theory was chosen for its utility in helping illustrate the role of
actors and their individual agency within the systems they are trying to change. First, I provide an
overview of institutional theory, followed by a brief discussion about how it has evolved over
time with a particular emphasis on how it relates to privacy on the ground scholarship.
Institutional Theory
Like other theories in social science, institutional theory is concerned with the
long-standing ‘agency versus structure’ debate. Each side of this debate represents a distinct set
of assumptions that characterize the relationship between actors and their environments
(Battilana & D’Aunno, 2009). Institutional theory enters the conversation by privileging social
and cultural processes, arguing that these institutions shape organizations and patterns of action
(DiMaggio & Powell, 1983; Lounsbury & Ventresca, 2003; Meyer & Rowan, 1977). A key
aspect that differentiates institutional theory from economic theories, such as the rational actor
model, is the focus on legitimacy and its relationship to the overarching environment within
which actors are embedded. Rather than seeing companies as driven solely by financial interests
and maximizing the bottom line, institutional theory posits that legitimacy is actually the main
motivator of action. According to Battilana & D’Aunno (2009), “While rational actor models
tend to neglect environmental influences on actors’ decisions, institutional theory takes into
account these external influences by assigning a key role to legitimacy considerations in actors’
decision processes” (p. 35). The quest to establish legitimacy in the eyes of the public is what
drives organizations to conform to the institutional forms of their environment. Early neo-
institutional theory focused heavily on structural elements such as cultural processes that shaped
human agency and behavior within organizations, leading to homogenization and isomorphism
(DiMaggio & Powell, 1983). According to Battilana & D’Aunno (2009), this was because
“legitimacy has a central role in neo-institutional theory as a force that constrains change and
pressures organizations to act alike” (p. 35).
Neo-institutional Theory. Early neo-institutional theory was characterized by a
deterministic orientation focusing on the overarching cultural processes and institutions that
constrained behavior and practice. According to Battilana & D’Aunno (2009), “the determinist
orientation focuses not on action, but on the structural properties of the context within which
action unfolds, and on structural constraints that shape individual or organizational behavior and
provide organizational life with overall stability and control” (p. 33). Human agency is assumed
to be limited as people are conditioned by their external environments rather than motivated by
free will. Scholars in this area are often concerned with how these institutions work, how they
are built, reinforced, whom they serve, and their impact on people. As Lawrence, Suddaby, and
Leca (2009) point out, “In this view, institutions not only influence how agents will act, but
which collective or individual actor in a society will be considered to have agency and what such
agents can legitimately do” (p. 4).
Neo-Institutional Theory and Privacy. The neo-institutional orientation to privacy can
be seen most notably in the work of Waldman (2018). Waldman’s (2018) privacy narrative
can be characterized through a neo-institutional lens, as his study of technologists and
lawyers discusses cultural processes that impact organizational structures and practices while
contributing to isomorphism in the field. In particular, he highlights an environment that is
heavily influenced by cultural practices, which constrain efforts to adopt more privacy-protective
norms into everyday practice. The fact that privacy is not a priority and routinely superseded by
other considerations such as user experience, speed, functionality, and security contributes to its
treatment as a “compliance mandate” rather than a core value of the company. As a result,
privacy structures are rendered insufficient due to significant variability in privacy definitions,
insufficient corporate structures of support, and narrow understandings that limit the translation
of privacy design principles into practice (Waldman, 2018).
Waldman’s approach aligns well with early neo-institutional understandings of
organizations which, according to Lawrence et al. (2009), place “a strong emphasis on the cultural
processes through which institutions affected organizational practices and structures and led to
patterns of isomorphism within fields of activity” (p. 3). In this case, organizations are motivated
to adopt practices like privacy notices and specific organizational structures not simply because
they want to be like their peers but because these actions are associated with establishing
credibility and legitimacy as an organization. Despite early success, a major critique of the neo-
institutional approach was its inability to account for organizations evolving over time. In
response, later versions of institutional theory focused on how organizational change takes place,
thus giving rise to scholarship on institutional entrepreneurship.
Institutional Entrepreneurs. While the classical neo-institutional lens has dominated
institutional theory scholarship throughout much of the 20th century, the 1990s represented a
turning point. For many years, neo-institutional theory had been criticized for neglecting agency
and lacking the ability to account for institutional change (Abdelnour, Hasselbladh, & Kallinikos,
2017). In what became known as the agentic turn, DiMaggio (1988) introduced the concept of institutional
entrepreneurship, which stressed the role of actors and agency in institutional change. He stated,
“new institutions arise when organized actors with sufficient resources (institutional
entrepreneurs) see in them an opportunity to realize interests that they value highly” (DiMaggio,
1988, p. 14). From here, scholars turned their attention to the role of individuals and
organizations as they innovated, strategized, and worked towards institutional change (Battilana
& D'Aunno, 2009). The focus was no longer on strategies used by actors to comply with
institutional arrangements but instead on how to change them (Lawrence et al., 2009). In a
review of institutional entrepreneur scholarship, Weik (2011) found this group was often
characterized as “agents pursuing certain interests and acting strategically” with an ability to
create meaning as well as mobilize resources and other actors. Other conceptions referred to
institutional entrepreneurs as leaders, visionaries, and reflexive agents (p. 3).
Institutional Entrepreneurs and Privacy. This orientation can be seen within privacy
primarily through Bamberger & Mulligan’s (2015) work studying privacy practitioners on the
ground. The institutional entrepreneurship lens is particularly fitting for Bamberger & Mulligan’s
work, which stresses the leadership role taken by chief privacy officers (CPOs) as they drive
development of new institutional processes and organizational change around privacy. Much like
CPOs’ line of work, institutional entrepreneur scholarship is “typically concerned with the
purposeful activities of individuals and groups that lead (directly or indirectly) to the adoption of
various policies and practices by organizational targets, or even the founding of entirely new
forms of organizations” (David, Tolbert, and Boghossian, 2019, p. 5).
Specifically, Bamberger & Mulligan (2015) credit the “centrality of the CPOs location
and policymaking autonomy within the corporate structure, and the way in which external
engagement shapes the CPOs substantive focus” (Bamberger & Mulligan, 2015, p. 76). In terms of
agency, Battilana & D’Aunno (2009) note that those who propose new rules and practices that
diverge from established norms and institutions are often seen as having high levels of agency.
Examples of this can be seen in Bamberger & Mulligan’s (2015) privacy narrative that
recognizes privacy leaders’ ability to frame privacy debates strategically, operate as powerful
and relatively autonomous actors, and bridge diverse stakeholder groups both inside and outside
their respective organizations. In particular, their work paints a picture of empowered CPOs able
to leverage the dynamic and ambiguous privacy landscape in a way that “fostered internal
reliance on their professional judgment, which in turn afforded them greater autonomy and
power within their organizations” (Bamberger & Mulligan, 2015, p. 12).
Shortcomings of Institutional Entrepreneurship. In terms of institutional theory,
institutional entrepreneurship has been criticized as overly simplistic, depicting actors as
superheroes immune from institutional pressures, yet somehow able to transform organizations
and environments at will (Abdelnour et al., 2017; Suddaby, 2010). Lawrence et al. (2009) point
out that “while this research has provided valuable insights, such work tends to overemphasize
the rational and ‘heroic’ dimension of institutional entrepreneurship while ignoring the fact that
all actors, even entrepreneurs, are embedded in an institutionally defined context” (p. 5). Many
of the same theoretical critiques of institutional theory can also be seen within the privacy on the
ground literature.
Limitations of Existing Privacy Narratives. Although both privacy narratives are
compelling in their own right, there are limitations to each approach. For instance, Bamberger &
Mulligan (2011, 2015) seem to overlook major structural challenges involved in trying to
change a system while still operating inside it (DiMaggio and Powell, 1991; Garud, Hardy, and
Maguire, 2007; Holm, 1995; Seo and Creed, 2002). Waldman (2018) critiques Bamberger &
Mulligan’s (2011, 2015) work, expressing skepticism about the extent to which visionary privacy
initiatives are actually implemented. In particular, he questions whether CPOs’ “dynamic,
forward-looking, and trust-based approach to privacy” has been embedded within company
cultures especially because these individuals are not usually directly involved in technology
design or development processes (Waldman, 2018, p. 725). He states “how are CPOs, business-
line executives, and unit-specific privacy leads ‘baking’ privacy into design if none of them
actually design anything? […] CPOs are not designers and many of the technology products we
use today seem to be designed without our privacy in mind” (p. 674 & 725). Using his own
primary research with technologists and lawyers, Waldman (2018) shows how little privacy
norms are actually integrated into everyday design and practice due to structural factors and
cultural processes. He concludes that there are two parallel ‘privacy on the ground’ narratives,
suggesting Bamberger & Mulligan’s conception of ‘robust privacy norms’ diffusing throughout
organizations has yet to be fully realized.
On the other hand, Waldman’s approach also has limitations because human agency is
applied and characterized inconsistently within the privacy ecosystem across his body of
work. Compared to his 2018 study, which highlights the deterministic structures, forces, and
practices that constrain technologist and lawyer behavior, Waldman’s (2020) conception of
privacy professionals is more aligned with a voluntarist orientation which stresses the ability of
these actors to choose between multiple courses of action. This acknowledgement of options
suggests that Waldman believes CPOs and compliance professionals are empowered decision-
makers with substantial impact over the course of action taken by their organization. He states,
“although there are many privacy professionals advocating strongly and effectively for privacy
prioritization inside technology companies, some professionals frame privacy law compliance as
a means of minimizing the risk to the company, not protecting consumers from data use harms”
(Waldman, 2020, p. 799). By emphasizing the ability of CPOs and compliance professionals to
make decisions about what the law means in practice, Waldman (2020) asserts they are
responsible for steering company policies towards managerialism:
Together [neoliberal managerialism and discourse] create a political environment that
prioritizes efficiency and demonizes regulation while imbuing the legal consciousness of
corporate actors with the notion that leveraging merely symbolic structures is morally,
politically, and even legally acceptable. That might suggest that the legal endogeneity
narrative described in this Article is more a systemic problem with today’s political
economy and compliance culture rather than a problem unique to privacy. That is both
right and wrong. Though focusing on employment discrimination and the merely
symbolic structures erected to comply with Title VII, Edelman suggested that some of the
blame lay with ‘compliance professionals’ who, in part because of ‘their professional
training and roles,’ will frame the law in managerial ways. (Waldman, 2020, p. 824).
This influence on organizational environments can be traced back to these practitioners’
interpretation of laws, creation of corporate policies, and development of internal programs that
indirectly work to establish regulatory standards across the industry. Waldman (2020) explains,
“The compliance ecosystem, from lawyers to privacy professionals to engineers, dominates the
social practice of privacy law because these compliance professionals, not legislators or
regulators, embed their vision into corporate practice and technology design. And these groups
create structures that proliferate throughout the privacy compliance market, thus impacting the
legal consciousness; judges, regulators, lawyers, and even consumers are starting to assume that
the mere presence of compliance structures is evidence of substantive adherence with the law” (p.
792). Although at odds with his past work, Waldman’s (2020) conception of CPOs as influential,
empowered actors is in line with existing privacy on the ground literature, specifically
Bamberger & Mulligan (2011; 2015).
A New Path Forward in Privacy
While privacy on the ground literature has labeled diverging interpretations as “parallel
narratives” (Waldman, 2020), organizational scholarship can act as a guide offering a lens and
vocabulary to delve into the debate in a way that incorporates strengths from each narrative
while striking a balance between agency and structure. This is done using a concept known as
the paradox of embedded agency.
Paradox of Embedded Agency. The paradox of embedded agency relates to the age-old
‘agency versus structure’ debate and essentially poses the question: how can actors who are part
of (or embedded in) an institutional environment realistically be expected to change that
institution or imagine alternatives (Battilana & D'Aunno, 2009; Holm, 1995; Seo & Creed, 2002;
Weik, 2011)? This conundrum gained prominence as a result of early critiques, previously
discussed above, involving institutional entrepreneurship being overly simplistic and inattentive
to overarching institutional forces and structures (Abdelnour et al., 2017; Lawrence et al., 2009;
Suddaby, 2010). In their highly influential paper, Battilana et al. (2009) argue that despite the
previous shortcomings of institutional entrepreneurship literature, the concept is vital to the
development of institutional theory because it enables exploration of the varying degrees of
human agency that exist within systems. For them, the challenge lies in understanding the
dynamic interaction between individual choice, embeddedness, structural determinism, and
change processes. They explain, “In this view, culture, institutions, and social relations are not
just toolkits for actors; they influence actors’ cognition and actions in important and often
unconscious ways. Their status as social facts implies a need to account for institutions and
social relations that constrain and enable, but do not determine, the choices of actors, and for the
recursive nature of relations between institutions and actions” (Battilana et al., 2009, p. 73).
Institutional Entrepreneurs as Change Agents. Central to this argument is the role of
institutional entrepreneurs as change agents who leverage resources and initiate divergent
changes that break the institutional status quo (Battilana et al., 2009). Building on DiMaggio
(1988), Battilana et al. (2009) point out that although all institutional entrepreneurs are change
agents, the reverse is not true. This is because institutional entrepreneurs by definition actively
participate in the creation of new institutional templates that cause them to “face specific
challenges arising from other actors’ institutional embeddedness and possible political opposition”
(p. 68). These institutional templates are commonly known as institutional logics, which represent a
field’s “shared understanding of the goals to be pursued and how they are to be pursued” (p. 69).
While change agents can engage in either non-divergent (changes aligned with existing
institutions) or divergent (changes that break with existing institutions) action, institutional
entrepreneurs must engage in divergent change, which can either take place within an
organization or the larger institutional environment in which the actor is embedded (Battilana et
al., 2009).
Enabling Conditions. Enabling conditions help unravel the paradox of embedded
agency by identifying factors that facilitate institutional entrepreneurship (Battilana, 2006). The
two categories of enabling conditions important for this study are field characteristics and actor
social position.
Field-Level Conditions. Studies have shown four field-level conditions to be correlated
with agency (Leca, Battilana, & Boxenbaum, 2008). The first is jolts and crises, which represent
external events or phenomena that impact the overall environment. This can include regulatory
changes, technological disruptions, and social upheaval, which disturb the socially constructed
status quo and generate new situations, ideas, demands, and expectations in response (Battilana
et al., 2009; Hoogstraaten, Frenken, & Boon, 2020; Leca, Battilana, & Boxenbaum, 2008).
Second are acute problems, which usually precipitate jolts and crises. These are typically
complex and multi-faceted such as scarcity of resources or environmental issues that can spur
agency through a need for new collaborations or processes (Leca et al., 2008). Third is the
degree of heterogeneity of institutional arrangements. More variance in terms of
pursuing alternative arrangements or doing things in different ways means more opportunity for
change and agency (Leca et al., 2008). The fourth field-level condition is degree of
institutionalization, which refers to the level that norms, practices, and values are used, accepted,
and taken for granted within a field (Leca et al., 2008). Fields with high levels of
institutionalization tend to discourage deviation from the established order, which in turn limits
institutional entrepreneur agency (Hoogstraaten, Frenken, & Boon, 2020). Dorado (2005)
developed a typology of action to facilitate analysis of agency based on heterogeneity of
institutional arrangement and degree of institutionalization. Organizational fields that are
opportunity transparent are characterized by a moderate level of heterogeneous arrangements
and a substantial degree of institutionalization, offering considerable opportunity for action.
Opportunity hazy fields are those with high levels of heterogeneity and low levels of
institutionalization leading to highly unpredictable environments. The uncertainty makes it
difficult for actors to strategize causing them to act cautiously, holding onto well-established
routines or adopting sense-making behaviors (Dorado, 2005). Opportunity opaque fields are
highly institutionalized and homogeneous, usually resulting in few opportunities for action or new
ideas due to the isolation from outside fields (Battilana and D’Aunno, 2009; Dorado, 2005).
Actor Social Position. While field-level conditions are important for institutional
entrepreneur agency, not all actors embedded in a field will have the opportunity to become
institutional entrepreneurs. This is due to social position, a condition that has been theorized as a
key variable mediating the relationship between an actor and their embedded environment.
According to Battilana et al. (2009) “Field characteristics are likely to influence whether actors
become institutional entrepreneurs, but actors perceive field conditions differently depending on
their social position in a field, which influences their ‘point of view’ about the field and gives
them differential access to resources” (p. 74). Actor position has been shown to influence access
to resources, capacity for relationship building, perceived legitimacy, authority, ability to
leverage institutional contradictions and alternative orders, as well as take advantage of bridging
opportunities between fields (Hoogstraaten et al., 2020).
Privacy on the Ground and the Paradox of Embedded Agency
The conditions discussed above are key to enabling agency and helping explain the
experience of embedded actors on the ground. For privacy, this offered an alternative way to
understand how certain actions and events within organizations contributed to the
overall privacy landscape and the spread of norms and practices throughout the industry. Exploring
challenges encountered by privacy practitioners and the techniques they used to navigate this
environment was important for advancing knowledge about corporate practices and privacy on
the ground as a whole, including these systems’ strengths, weaknesses, shortcomings, and
capacity for change.
Even though comprehensive and sectoral approaches to data protection share many of the
same processes and practices, the overarching goals they work toward and how they get there
often differ, since the motivations that drive policy and the definitions of success can play out in
superficially similar yet very divergent ways. While the sectoral versus comprehensive approach to privacy
has been debated heavily in privacy and policy literature (Hartzog & Richards, 2020; Hoofnagle,
2010; Pernot-Leplay, 2020; Richards, Serwin, & Blake, 2022), this study extended this
discussion by combining it with organizational change scholarship to see how the two intersect
to further understanding of the privacy on the ground space. Research questions for this study
were:
RQ1: How has the data privacy compliance landscape in the United States evolved since
2018?
RQ2: What challenges have practitioners encountered as they implemented data
mandates within organizations?
RQ3: How have privacy compliance challenges been overcome?
Method
Data Collection
See Study 1 for detailed information about interviews, participants, recruitment, and
sample.
Qualitative Analysis
Study 2 was interested in the role of practitioners and institutional dynamics within the
organizational change process. Unlike Study 1, which did not have any preconceived theory
guiding the analytic process (a “bottom-up” approach), Study 2 intentionally built on theoretical
codes and constructs from the literature to gain a fuller understanding of the privacy landscape.
According to Thornberg & Charmaz (2014), “theoretical codes consist of ideas and perspectives
that researchers import to the research process as analytic tools and lenses from outside, from a
range of theories. Theoretical codes refer to underlying logics that could be found in pre-existing
theories” (p. 159). Using preexisting theory to code and analyze the current study’s corpus of
data helped generate new insights by focusing attention on more subtle or less obvious aspects of
the dataset such as specific ideas, terms, and relationships between categories (Charmaz, 2006).
This type of analysis is often referred to as deductive or a priori. Deductive approaches
usually involve analyzing data from the “top-down” with existing theoretical concepts serving as
a mechanism to sort data into predetermined categories based on the literature (Bingham &
Witkowsky, 2022; Braun et al., 2019). Using the same corpus of data from Study 1 allowed for
greater exploration of study phenomena with previous research frames from the literature serving
as a lens to inform interpretation of current data while subsequently allowing findings to be put
in dialogue with existing research.
Previous themes from Bamberger & Mulligan (2015) and Waldman (2020) were
identified and flagged as extremely relevant for this research. Utilizing a two-cycle coding
process (Miles, Huberman, & Saldaña, 2014), the first cycle employed provisional coding based
on Bamberger & Mulligan’s (2015) and Waldman’s (2020) themes of privacy. For Bamberger &
Mulligan, the theme of privacy as trust was used in provisional coding along with its associated
sub-code of risk management. Waldman’s (2020) themes included ambiguity and process in
privacy law, framing corporate obligations as risk avoidance, and symbols of compliance. These
provisional codes helped to inform the categorization of the current study’s data by providing a
coding scheme to help group data based on existing theory. This process provided a framework
to understand the current corpus of data in relation to existing research. While provisional codes
informed how data was initially classified during first-cycle coding, the second cycle used focused
coding to group codes based on emergent themes. Focused coding is a process that categorizes
coded data based on themes and conceptual similarity (Saldaña, 2014). A conceptual map of
themes can be found in Figure 2.
Figure 2
Study 2 Themes
Results and Discussion
Compliance mandates have traditionally served as a catalyst for companies to engage in
necessary change. Establishing a minimum standard of expected behavior allowed companies to
orient their processes towards a certain benchmark and gain legitimacy as an organization if they
could demonstrate they had met the bar for compliance. Study results highlighted internal
organizational dynamics and practitioner experience translating new compliance mandates into
practice. Interviewees discussed how comprehensive data protection legislation had altered the
compliance landscape by changing the definitions of what it meant to be compliant and how
compliance could be attained. This transition necessitated the development of new company
cultures and organizational realities around data that were focused on greater accountability and
oversight of data.
However, this process was complicated by the fact that completely different models of
data protection were being merged together with little formal guidance or clarity. Comprehensive
data protection laws like the General Data Protection Regulation (GDPR) and the California
Consumer Privacy Act (CCPA) were pitted against sectoral systems of practice, giving rise to
tensions on the ground. Without formal authority to make changes and push back against
traditional sectoral values, structures, and processes, practitioners were severely limited in the
amount and scope of work that could be done. Findings from this section illustrated how privacy
practitioners struggled to reconcile two conflicting models of regulation and account for the
values, assumptions, and practices ingrained and reinforced by the US sectoral approach to
privacy. First, this chapter discusses the evolving perception of compliance. Interviewees
described compliance as a continuum in regards to what it has come to mean in theory and
practice pre- and post-GDPR and CCPA. Next, results highlight how the two approaches to
privacy can be seen on the ground and the tensions and problems that result when comprehensive
approaches to privacy are layered on top of sectoral systems of practice without appropriate
institutional foundation or support.
Evolving Definitions of Compliance
Pre-GDPR Compliance
When asked to describe how companies were responding to new data privacy legislation,
participants often brought up the distinction between pre- and post-GDPR compliance. Prior to
2018, compliance aligned with the sectoral approach to data protection and was characterized by
a lack of enforcement, self-determination, and market regulation. Many governments around the
world primarily relied on a laissez-faire, hands-off approach that allowed companies to operate
largely unchecked with very little oversight or government interference (Swire and Kennedy-
Mayo, 2020). Even though the European Data Protection Directive had been on the books since
1995, the GDPR changed the game by introducing the threat of enforcement and establishing a
default standard for data protection around the world. Before this, there was very little consensus
in the United States around data governance. Participants explained that standards varied
considerably by sector with certain fields like finance and healthcare having strict protocols
around what qualified as data and how it had to be handled versus other sectors which had few if
any regulations around data. For companies outside of finance and healthcare, as long as they
met minimum standards they could generally avoid any enforcement actions from government
agencies like the Federal Trade Commission (FTC), assuming they did not violate their own
policies or engage in any unfair or deceptive practices (Swire and Kennedy-Mayo, 2020). All
interviewees stressed that the issue of data privacy was largely separate from everyday business
decision-making and there was not a high level of awareness within companies about the larger
data privacy environment. In general, companies were given the freedom to decide if, when, and
how they wanted to address data privacy. Because of these conditions, the GDPR represented a
major ‘jolt and crisis’ by ushering in a new era of privacy standards and practices that put
consumers’ privacy front and center. One participant, a privacy consultant who has worked in the
field for over 20 years, described the paradigm shift that began in 2018.
In comes the GDPR that actually puts some money, teeth, and enforcement behind the
[European Data Protection] Directive that had been in place for many years. What’s
interesting is that businesses globally had an ‘oh my god’ moment as this was heading
into enactment, except for the fact that there had always been requirements around this
but nobody paid any attention. What we saw was a demonstration of having a lack of
enforcement or enforcing entity. If there’s no stick behind the carrot of good behavior,
there probably won’t be. –PRIVACY_6
Even though the GDPR only indirectly affected companies in the United States, all
eighteen participants emphasized the massive shift it represented to the field by threatening the
way companies have traditionally done business. On January 1st, 2020, the California Consumer
Privacy Act (CCPA) came into effect, which built on the same comprehensive data protection
framework laid out in the GDPR, extending similar protections to residents of California.
Although specific to California, many experts argued the legislation, nicknamed ‘GDPR-lite,’
would have a ripple effect on companies across the country. The legislation attempted to address
gaps in consumer privacy while moving away from the United States’ traditional sectoral
approach to privacy in favor of the European Union’s comprehensive approach to data protection.
Together, these two pieces of legislation not only changed compliance standards but also started
to challenge long-standing sectoral values and practices such as self-regulation and the use of
notice-and-consent (sometimes referred to as notice-and-choice) to release companies from
data-related liabilities. Participants pointed out that while compliance was traditionally used to
indicate good practice, the introduction of comprehensive legislation began to alter the definition
of “compliance.” This shift chipped away at the assumptions underlying that relationship,
putting pressure on companies to define and articulate what ‘good practice’ actually meant rather
than automatically equating it with compliance.
Post-GDPR and CCPA Compliance: In Theory
When asked what compliance meant post-2018, participants were careful to distinguish
between compliance in theory and in practice. When talking about compliance in theory or what
CHAPTER 3: THE EVOLUTION OF THE COMPLIANCE LANDSCAPE
68
it ideally should look like, interviewee responses mirrored comprehensive data protection
scholarship, describing a new perspective that built on privacy as a fundamental human right and
viewed privacy regulation as a dynamic, unstable, and complex environment. Privacy was no
longer merely a business obligation; protecting consumer data became a moral responsibility of the
company. Part of this process involved introducing
standardized rules and regulations across previously unregulated sectors in an attempt to build a
new culture of accountability around data. The idea was for companies to move away from
seeing compliance as meeting ‘minimum standards’ to develop a more proactive and forward-
thinking approach to data governance. Rather than waiting on governments to release checklists
that detailed every standard that companies must comply with, this approach attempted to shift
some of the responsibility onto companies, having them become more invested and accountable
for their own data practices. PRIVACY_10, another consultant specializing in global compliance
and privacy management, summarized why this ‘minimum standards’ mentality was no longer
appropriate for the current post-GDPR/CCPA compliance environment.
The problem with just achieving compliance is like getting a 51% passing grade in school.
You’ve just made it. The problem is that the test is never standing still. There are always
new threat actors, new vulnerabilities, new ways people can be socially engineered, and
the laws are not standing still. The expectations are rising. If you aim for 51% today, it’ll
be 49% tomorrow. You shouldn’t be aiming for mere compliance, you should be aiming
for the capacity to comply with whatever rules that are likely to come your way. Really
you should be aiming for I'd say 80% is a good mark. Aiming for 80% means you’ve got
some cushion against changing expectations but more often it’s about having the
capacity to rise to the occasion.
We’re not in a static world either from a regulatory standpoint or from a risk
standpoint. People are coming up with new and exciting uses of information all the time.
Data analytics, AI –we want those things. AI could potentially solve so many issues
relating to pandemics, for instance, if we use them properly. But we can also use it
improperly and create an Orwellian state. So we’ve got choices to make and if companies
are making the choice to go for mere compliance, you’re always going to be chasing that
ball and never actually comfortable. –PRIVACY_10
This quotation encapsulated a core assumption of compliance under the comprehensive
model on the ground. Specifically, it acknowledged that data environments and internal company
ecosystems were much too complex, diverse, and variable for governments to micromanage and
regulate in the ways they traditionally had under the sectoral model. Rather than
relying on governments to lay out every detail of compliance standards for every different sector,
it shifted some of the onus onto companies themselves who were expected to use broad
government mandates as a guide to set their own benchmarks and individualize what compliance
meant for their company. By expecting companies to get more involved and take greater
ownership over their internal data environments, processes, and standards, privacy would ideally
become built into the organizations themselves.
Participants stressed this required the development of new organizational realities since
employees at all levels of the company had to be aware of what privacy was, why it was
important, and the rules they must comply with in addition to how each employee was
responsible for integrating and upholding these values in practice. Letting go of old compliance
mentalities and reorienting company strategies to be more proactive and responsive to
developments on the horizon required anticipatory action and all-hands-on-deck collaboration
among departments. For this to happen, companies had to negotiate what good practice
meant for them and set their own benchmarks accordingly.
However, the question of what constituted good practice was not quite clear-cut due to
the number of interests at stake. How practitioners aligned their work on this spectrum was very
much influenced by how open their company or employer was to privacy changes in the first
place. PRIVACY_8, an attorney specializing in privacy and data security at a national law firm,
discussed how this subtle shift in orientation could translate to actual changes in strategy on the
ground. He noted that his firm preferred to take a holistic approach to helping companies comply
with privacy mandates, “I think where some law firms might tend to be over combative in
resisting a lot of these changes, our approach typically is to be more embracing of them and help
companies by providing real-world, practical solutions for how to navigate them. That’s one of
our philosophies.”
Participants were adamant that in order for privacy to be seen as a core interest of a
company, privacy work could no longer be a separate mandate or process that must be fulfilled.
Instead, it had to be treated as a core value that was equally important to profit and factored into
all major decisions. Under this organizational reality, privacy would become an essential part of
the brand, business identity, and basic functioning of the organization. This updated conception
of compliance required internal company structures to buttress the work of privacy practitioners
as they engaged in divergent organizational change, introducing new organizational realities and
institutional logics like values, norms, and patterns of practice around data. All participants
emphasized the importance of strong leadership to encourage and support cross-departmental
collaboration in order to break down silos that delineated different aspects of business.
PRIVACY_6 discussed what it took to be compliant under this updated model.
There’s no way to be compliant with GDPR or with CCPA unless you have the full-on,
hands-on participation from the business because it’s the business teams [that] are
actually ingesting and working with this data –be it employee data or consumer data. The
work is being done by teams that are neither legal nor IT. –PRIVACY_6
Prior to this shift, many internal departments could work relatively independently,
crossing over only on an as-needed basis. This departure from the traditional sectoral model, in theory,
allowed companies to have an easier time meeting and exceeding minimum data protection
standards because there was one overarching set of standards rather than a collection of
disjointed laws that applied only to select departments, data, circumstances, or sectors.
Post-GDPR and CCPA Compliance - In Practice
Although this updated conception of privacy compliance was compelling, all
interviewees noted this represented an idealized version more than reality. Part of the challenge
for practitioners was figuring out how to adapt comprehensive legislation to a sectoral system of
practice. Participants pointed out that what made organizational change so complicated was that
they were not simply updating definitions of compliance but rather creating an entirely new
organizational culture, reality, and patterns of practice around data and ethics. Compliance under
a comprehensive model was perceived to be best accomplished when companies had a clear, up-
to-date understanding about rules and mandates and were proactive about addressing problems.
As previously discussed in Study 1, comprehensive laws like the GDPR and CCPA might be
clear on paper as to what constituted data and how it should be handled in theory, but things
could quickly become messy trying to figure out all the different rules data might be subject to,
when, and where since these could vary significantly by sector, region, and jurisdiction.
Participants described how this challenge was amplified under the sectoral model where
multiple sets of rules and organizational realities applied to a single set of data. Depending on the
context in which it was collected, certain types of data could be subject to extra requirements and
precautions around use, handling, and sharing as is the case in the financial and healthcare
sectors (Swire and Kennedy-Mayo, 2020). There was a consensus among interviewees that while
this had made sense in the past, the approach was quickly becoming outdated as data was being
moved, aggregated, transformed, and transferred in ways that eluded sector-specific legislation.
PRIVACY_8 gave an example of the confusion that could arise when a comprehensive state privacy
law, in this case the CCPA, overlapped with sector-specific federal legislation like the Gramm-
Leach-Bliley Act (GLBA), which covered consumer data privacy in the financial sector. In
particular, he highlighted the elaborate carve-outs written into the CCPA that were designed to
make the two laws compatible in practice by ensuring the CCPA did not interfere with GLBA
provisions yet still protected consumer data in situations when GLBA did not apply.
For the Gramm–Leach–Bliley exception, which is the financial services exception, the
definition for that exception is if the consumer gave you the data because they were
interested in receiving a financial product or service, then it’s exempt. It can even be like
‘hey here’s my email address, I just want to know more about your credit card’ or
whatever. That’s enough to make it exempt. But if it was collected through some third-
party where they were just administering a survey on Facebook and they were like, ‘hey
take our ten question survey’ and through that survey you gave them your email address
and the credit card company then comes to that survey administer (sic) who is a third-
party data broker or a lead generator and says ‘hey we want to buy 50,000 email
addresses from you’ that email address was not originally provided in order to receive a
financial product. So it’s regulated by the CCPA in the same way the health industry
entities may have received data to maybe supplement existing data they have or to market
to a new audience and that data may be subject to the CCPA.
You really have to ask what are the circumstances through which the data was
collected and what was the consumer’s expectation when they provided the data? Did
they understand that they were providing the data to get healthcare or to receive
information about a financial product? If they didn’t then you’re in trouble and it’s more
regulated. –PRIVACY_8
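Reduced to its core, the carve-out PRIVACY_8 described turns on a single contextual question: did the consumer provide the data in order to receive a financial product or service? The sketch below captures that decision rule in Python. It is an illustrative simplification of the interviewee's account, not legal advice; the class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CollectedData:
    """One piece of consumer data and the context in which it was collected."""
    value: str                               # e.g. an email address
    given_to_obtain_financial_product: bool  # did the consumer provide it to get a financial product/service?

def governing_regime(record: CollectedData) -> str:
    """Apply the simplified carve-out rule: data the consumer provided in
    order to receive a financial product or service falls under the GLBA
    exemption; otherwise the CCPA applies."""
    if record.given_to_obtain_financial_product:
        return "GLBA exemption (exempt from CCPA)"
    return "CCPA"

# A consumer emails a credit card issuer to ask about a card: exempt.
direct = CollectedData("jane@example.com", given_to_obtain_financial_product=True)

# The same address bought from a survey administrator acting as a lead
# generator: it was not originally provided to receive a financial
# product, so the CCPA governs.
brokered = CollectedData("jane@example.com", given_to_obtain_financial_product=False)
```

The point of the sketch is that the same data element (here, an email address) can fall under different regimes depending solely on the circumstances of collection, which is precisely what made compliance determinations context-dependent in practice.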
This was one example of the many different ways sectoral privacy laws conflicted with
comprehensive data privacy legislation in practice. Because it was not practicable for every
scenario to be laid out and codified into law, it was incumbent upon practitioners to develop their
own understandings and interpretations of these rules that would ideally protect the company from
running afoul of regulators. Many practitioners spent significant amounts of time meticulously
researching laws, relevant cases, and learning from peers at other organizations to understand
thoroughly not only explicit provisions and exceptions of the law but also overarching context
for how the legislation fit into the existing regulatory environment. This more expansive purview
provided the foundation that informed how practitioners approached and developed company
privacy policies and strategies.
In terms of translating these principles into practice, interviewees described engaging in a
relatively basic conception of compliance that revolved heavily around figuring out what laws
meant in a specific context for their individual company, identifying internal practices that could
increase risk of fines or lawsuits, educating and communicating risk to internal stakeholders or
clients, and helping position the company to develop data governance strategies that could
reduce vulnerabilities and corporate liability related to data. Interviewees routinely expressed
frustration over the difficulties of implementing a comprehensive approach to compliance while
operating under a sectoral system of practice. The tension between the different models of data
protection became visible as certain values and practices perpetuated and reinforced
sectoral structures, organizational realities, attitudes, and patterns of behavior
that resisted, undercut, and disadvantaged comprehensive privacy reform efforts within
organizations. According to interviewees, what made compliance so difficult was the lack of
infrastructure to support necessary changes that would allow companies to better follow through
on data mandates.
The Difficulty of Comprehensive Privacy Compliance in Practice
According to interviewees, to do privacy ‘right’ required an extensive amount of
resources, leadership backing, collaboration, and supportive infrastructure. This aligned with
previous research by Bamberger & Mulligan (2015), who studied CPOs at some of the most elite
companies around the world. They identified and outlined best practices that allowed these
businesses to move beyond the traditional compliance mindset by operationalizing privacy using
distributed networks to integrate privacy and data protection as a core value of the company.
This involved embedding teams of “dedicated privacy professionals and specially trained
employees within business units. These employees are empowered with practices and tools to
help identify and address privacy during product design and in the early stages of business
development” (Bamberger & Mulligan, 2015, p. 13). While ideal, study participants explained
that many of these “best practices” were largely out of reach for the average SME in the United
States. Specific infrastructure that was lacking included 1) leadership buy-in, 2) organizational
awareness of privacy and data protection, and 3) collaborative structures.
Leadership Buy-In
Compared to research by Bamberger & Mulligan (2015), study participants noted that
leadership at SMEs was less progressive and further behind in its understanding of
the US and international data landscape. Obtaining leadership buy-in was one of the most
important factors in successful organizational change yet it was one of the most difficult things
to do. The amount of work required to overcome competing interests was extremely high and
there was no guarantee of success. Most participants could recall at least one instance when they
received pushback from leadership who viewed privacy as an ancillary priority comprising a
distant set of actions that only applied in certain situations. Aside from a few specific instances
like approving budgets or responding to major crises in the news, privacy, for the most part, did
not factor into everyday business decision-making. PRIVACY_11, an attorney specializing in
strategic data privacy and cyber-preparedness compliance counseling at an international law firm,
described the privacy mentality of top leadership in companies.
They don’t know anything about [privacy]. I think what they do is they just defer to a
chief information security officer (CISO) or the digital transformation and AI team so the
only time they really hear about data is when they’re going to pay a bunch of money for
AI or machine-learning and they need to approve that or there’s a disastrous data breach
either in their own company, some other sister organization, or some other similar
industry. They call their CISO in and say ‘do we have the same problems?’ What needs
to happen is there needs to be continuous leadership. Data should come up at every
board meeting, just like any other asset of the company. –PRIVACY_11
Interviewees mentioned receiving support from leadership when problems involved
cheap or easy fixes or if the issues were egregiously bad. However, outside of this, there was not
much interest in making privacy a core part of the business since leaders saw little value in going
above and beyond the minimum standards.
Stay With the Pack Mentality. Rather than looking inward and developing their own
strategies according to company need, participants described many business leaders opting for an
easier route, which involved looking at similar companies in the field and benchmarking their
strategies accordingly. PRIVACY_3, an attorney specializing in cyber law, data security, and
privacy, summarized this mentality of many business leaders:
A lot of clients are saying, let’s wait and see. We don’t want to stick our thumb out, we
want to do the bare minimum to be compliant, we want to look like we’re compliant but
we don’t want to spend a ton of money doing this because it’s costing a lot and we don’t
know what’s going to happen, and it’s not being enforced yet so they think have a little
bit of time which maybe they do, maybe they don’t. –PRIVACY_3
This approach was hugely problematic for advancing privacy compliance under a
comprehensive model because without companies internalizing privacy responsibilities, creating
their own benchmarks, and setting their own compliance goals, privacy policy became stagnant.
Despite the growing number of warnings from US policymakers, regulators, and legal teams
imploring companies to start taking consumer privacy more seriously, many companies seemed
to feel little urgency in tackling these problems. PRIVACY_8 elaborated on this ‘pack mentality’
and why it is so prevalent within leadership circles.
So often the question from companies is ‘what’s everyone else doing?’ They don’t want to
know ‘what do I technically have to do?’ They want to know ‘what is everyone else
doing?’ and I just want to do that. They don’t want to be an outlier. They don’t want to
be doing way less than everybody else and they also don’t want to be doing way more
than everybody else. They want to just be in the pack and that’s the name of the game.
[This is partly due to] confirmation bias in companies. If someone tells them ‘you don’t
have to worry about employee data for CCPA’ they’re just like ‘okay, bye’ and they don’t
want to know anything more about it because it confirms what they’re hoping to hear
which is that they don’t have to worry about it. I think the bigger multinationals and the
companies that are used to being regulated, they don’t quite do that. They’re like ‘okay
well when do we have to worry?’ –PRIVACY_8
Companies with a history of regulation understood the writing on the wall and were using
the time to position themselves for how the field would evolve. However, for others, this
‘pack mentality’ proved especially vexatious for privacy practitioners.
As seen in Study 1, it caused many of the vital data protection mechanisms and accountability
measures to become watered-down as they failed to diffuse throughout the organization. Actions
became “performative” as companies obeyed the letter of the law by going through the motions
of what compliance was supposed to look like without actually adopting the provisions or
fulfilling the spirit of the law. Even though comprehensive approaches to privacy in practice
might work for companies like Apple and those in Bamberger & Mulligan’s (2015) study, it was
a hard sell for SMEs that had fewer resources and were either unable or unwilling to adapt their
business models without greater tangible incentives or threats of reprisal.
Keep It All Mentality. Another mentality that practitioners struggled to convince
leadership to move away from was the stockpiling of unused data collected by companies over
the course of regular business transactions. A key benefit of the sectoral model was that it
allowed companies to balance privacy against other values such as free speech and financial
interests, which pushed privacy considerations to the back burner (Pernot-Leplay, 2020). As
Swire and Kennedy-Mayo (2020) point out, one benefit of sectoral regimes is that they allow for
greater processing of personal data, “A different critique of comprehensive regimes is that they
may provide insufficient opportunity for innovation in data processing. […] To the extent that
comprehensive laws may discourage the emergence of new services involving personal
information or require prior approval from regulators, the pace and diversity of technological
innovation may slow” (p. 60). This brought up questions about what it actually meant to ‘own’
data and whether companies had the right to use or benefit from data in their possession.
Because the US sectoral model does not explicitly outline privacy as a fundamental human right,
it provided grounds for businesses to argue they had a claim to use consumer data as an asset or
commodity (Pernot-Leplay, 2020; Peltz-Steele, 2012). The challenge for practitioners lay in
convincing company leadership that there was more value in deleting unused data than retaining
it. As PRIVACY_3 explained,
Some of [the companies] are looking at what they’re doing and saying ‘we didn’t realize
we were doing this, we’re going to stop and just get rid of the data.’ But that’s hard for a
lot of companies because traditionally the mentality’s been ‘just keep everything, you
never know what you’re going to use it for’ and I think that mindset’s going away quickly
because now it’s just a liability if you have all this data. So they are changing in that
regard but it depends on the company and what value they’re getting for it. –PRIVACY_3
While supportive leadership allowed privacy practitioners to exercise authority, mobilize
resources, and institute changes, many of the interviewees cited ambivalent leadership attitudes
as a major hindrance when carrying out privacy mandates. This shift away from the ‘keep it all’
mentality was not just about deleting data but about closing the door on revenue streams that
might be generated in the future. This was not an easy decision because it eliminated a possible
source of income or at the very least the ability to pitch investors and shareholders on the
potential of untapped revenue streams. Whether these sources of income actually materialized
was not necessarily important. Instead, the mere idea of closing off access to stockpiles of data
could put companies at a major disadvantage depending on their industry and business model.
These concerns were heightened when there was not a level playing field and competition was
not held to the same standard.
The problem was that the EU’s conception of privacy as a fundamental human right
conflicted with foundational assumptions of Big Tech and technological infrastructures that
society had become dependent on. This ‘keep it all’ mentality was deeply rooted in society’s
understanding of technology and its role in driving innovation. Many dominant organizational
realities were built around the belief that big data and artificial intelligence would uncover
magical ‘needles in the haystack’ that could unlock secrets about the human condition and the
world. This had become a core value and assumption of business that continued to drive
technological advancement and inspire investors from Big Tech, venture capitalists, and startups
to invest in big data. These values were deeply ingrained in the data ecosystem and technological
infrastructures, which were designed and built to capture data indiscriminately, like deep-sea
fishing nets collecting whatever they could grab. The logic of amassing as much data as possible,
even if there was no current use for it, was driven by ‘assetization,’ or data’s potential to be
transformed and gain value as an asset in the future (Beauvisage & Mellet, 2020). While the
‘keep it all’ mentality could be justified under the sectoral model as striking a balance between
privacy and business interests, comprehensive data protection approaches like the GDPR and
CCPA increasingly viewed stockpiles of unused data as liabilities which increased the likelihood
of data being compromised and posed unnecessary risks to consumers and organizations.
Treating privacy as secondary to business functions might have worked in the past, but it
was especially problematic when data was ubiquitous and every company was a ‘data company’
collecting, receiving, exchanging, and using data in some capacity or another – whether they
intended it or not. The inability to separate data from essential business functions combined with
limited leadership buy-in made it difficult for practitioners to carry out mandates as they
navigated competing priorities, organizational realities, and institutional logics. However, all
eighteen interviewees pointed out that when done correctly, strong privacy programs actually
opened doors for companies to be more flexible, creative, and efficient in their use of data. The
more companies knew about their internal data processes and inventories, the more opportunities
they had to target specific areas of interest without worrying about violating privacy rules or
ethics around data. PRIVACY_10 provided an example of the mental shift that can occur when
companies begin to understand the scope of data in their possession:
If you don’t have a good privacy program, you can suffer what is called data reticence.
‘I’m afraid to do anything with the data because I don't understand how to mitigate the
risk and I don’t understand the rules.’ So you see the privacy professional’s role today is
not just about complying with the law, it may be to help liberate and create more uses of
data by ensuring it’s being done correctly. So you see there’s a counter point to that
which is again, if you’re doing the right things, you will probably find more ways to use
data, not less. –PRIVACY_10
Without management-backing to approach privacy in a more thorough and integrated
way, privacy practitioners were often forced to adapt comprehensive privacy mandates to fit
existing systems of practice. This could lead to more piecemeal efforts on the ground that
addressed the letter of the law but might compromise or miss out on the spirit of the law.
Low Levels of Organizational Awareness around Privacy
A related but somewhat separate challenge practitioners experienced when attempting to
set goals and drive more aggressive privacy change was the low levels of privacy awareness
among company employees. In order to be transparent and accountable about data, companies
needed to know what data was in their possession, what was being done with it, and if/how they
were disposing of it when it was no longer needed. Participants described their experiences
working with employees and departments to understand current data practices. Almost all
interviewees pointed out that this process was made exponentially more difficult because each
person and department often had their own ways of documenting and using data. Employees who
had no previous experience complying with regulations often had trouble adapting their existing
workflows to accommodate more standardized procedures and mechanisms designed to keep
track of data. Over half of the participants stressed the importance of asking explicit questions
when assessing data practices since many employees did not realize all the different types of data
that must be accounted for.
PRIVACY_6 gave insights into the details of questions and framing used by many
practitioners as they attempted to gather necessary information required for privacy compliance.
Who are you getting the data from? Whose data is it? How are you getting it- so what
technology controls and security is there? How does it move through your organization?
Who touches it- because you have to tell people who’s touching it. It does include what
businesses outside the virtual four walls – I always try to refer to it that way because
people have a funny habit of answering only the specific question you ask them. So
sometimes you say vendor and they’re like ‘oh well they’re not a vendor, they’re our
partner.’ But you heard me telling you what businesses are you sharing this with? So I
always say ‘does it ever go outside the virtual four walls of your company and if so to
whom, how, and why?’ –PRIVACY_6
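The questions PRIVACY_6 walked business teams through map naturally onto a simple data-inventory record: who the data is about, where it comes from, how it is collected, who touches it internally, and whether it ever leaves the "virtual four walls." The sketch below shows one hypothetical way to structure such a record; the field names are invented for illustration, and real data-mapping tools capture far more detail.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataFlowRecord:
    """One entry in a data inventory, organized around the interview questions."""
    data_subject: str                  # whose data is it? (employee, consumer, ...)
    source: str                        # who are you getting the data from?
    collection_method: str             # how is it collected, and under what controls?
    internal_handlers: List[str] = field(default_factory=list)    # who touches it inside the company?
    external_recipients: List[str] = field(default_factory=list)  # any outside business, vendor or "partner"

    def leaves_virtual_four_walls(self) -> bool:
        """True if the data is shared with any outside business, regardless of
        whether that business is labeled a vendor, partner, or anything else."""
        return len(self.external_recipients) > 0

record = DataFlowRecord(
    data_subject="consumer",
    source="web signup form",
    collection_method="TLS-encrypted form submission",
    internal_handlers=["marketing", "sales"],
    external_recipients=["email service provider"],
)
```

Framing the external-sharing question as a property of the record, rather than as a question about "vendors," mirrors the interviewee's observation that employees tend to answer only the specific question asked.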
Another struggle practitioners faced was getting employees on the same page.
PRIVACY_8 provided an example of a struggle he had witnessed as an outside consultant
building a company’s privacy program:
I’ve walked into companies where the privacy or data person will tell you ‘we never
collect credit card or payment information’ and then you talk to the sales person or a
contract person and they’re like ‘oh yea, sometimes we’ll actually write down all that
information’ and they’ll store it in this database and you’re just like ‘oh my god’ because
nobody has asked the question. They just haven’t had that dialogue. It’s very hard to
come by as legal still. –PRIVACY_8
As these examples highlighted, the lack of employee awareness and experience dealing
with privacy issues complicated practitioners’ efforts to make employees more aware and
invested. Most employees had weak understandings about how data mandates applied to them
and the role they played within their company’s privacy compliance process. While all eighteen
participants agreed trainings were important, four interviewees noted that trainings alone were
not enough. Even though company-wide trainings were useful for teaching employees what
privacy was and why it was important, they had limited utility when it came to helping
employees translate these concepts into practice. Borrowing a concept from educational literature
known as Bloom’s Taxonomy (a model that hierarchically classifies educational learning
objectives based on complexity and specificity), most company-wide training programs could be
described as focusing on lower-level learning processes like recalling knowledge and
understanding information. Helping employees reach higher levels of learning would allow them to
apply these mandates to their everyday workflows, analyze their effectiveness, and ideally suggest
improvements for how privacy could be better incorporated throughout business workflow and in
turn the organization. Similar to Bamberger & Mulligan’s (2015) networked privacy best
practices, this approach required strong leadership support, a tremendous amount of time,
resources, energy, and investment that many companies either could not or were not willing to
commit.
Collaborative Structures
Like many companies around the world, organizational structures in the US tended to be
hierarchical. Dividing the organization into sub-units might be helpful for streamlining business
function but posed problems for practitioners as they attempted to build a collaborative culture
around privacy. Nine practitioners pointed out that coordination across departments had only
become more important since the introduction of comprehensive legislation in 2018 and had
since been amplified by the Covid-19 pandemic. This distributed responsibility for data had not
only necessitated extended cooperation between different divisions of the company but also
started to blur the lines between job-related responsibilities and the sector-specific laws that
applied to each department. The sectoral approach of having each division or group be
responsible for a specific area or task within the privacy chain was compromised because
different rules could apply to the same data depending on the context. Advances in technology
further obscured the boundaries between sectors, rendering the traditional sectoral boundaries
largely moot (Nahra, 2016). Without top-level executive buy-in, practitioners struggled to
establish credibility and legitimize more ambitious initiatives.
Lack of Authority. The lack of authority to mandate or incentivize collaboration across
departments was related to social position within the company. Interviewees pointed out that
privacy practitioners were often placed at the periphery of their organizations or nestled within
departments that were the biggest targets of change. This metaphorical distance could make it
difficult for practitioners to bridge boundaries and encourage collaboration between departments
when their role was constantly being redefined and renegotiated depending on their location
within the organizational structure. PRIVACY_10 explained:
So the question then becomes what role do you play in the organization and who do you
answer to? If you work for the compliance office or general counsel, it’s a little bit easier
because you’re somewhat outside of it. If you respond through the CIO [Chief
Information Officer] or if you’re working in say a direct advertising context -you’re
answering to the Chief Marketing Officer- there will always be this pressure to approve
things. […] It is a tough road sometimes because you will have organizations say ‘yeah
we need a privacy officer but we’re not really going to pay attention to what they say.’
Those people don’t tend to retain their privacy officers very long. –PRIVACY_10
Part of the challenge was that dominant organizational realities treated privacy work as a cost
center rather than a revenue generator. Interviewees pointed out that leadership played a key
role in the adoption of privacy reforms. One challenge of diffusing privacy norms throughout an
organization was that privacy was not confined to a single department and there were few pre-
existing internal structures that promoted accountability. PRIVACY_6 explained how leadership
could begin to change the culture:
When you have leadership, every level of the leadership from the lowercase l to the
uppercase L, when they are all talking the talk and walking the walk and demonstrating
the importance, I think then it slowly starts to change but it’s hard. It’s extra work or it is
a divergent task or it is also as I was saying before with legal and tech, it can be
[inaudible], it can be budget, it can be all sorts of internal territorial wars that break out.
But having some accountability also having it fit in a well-defined place so that there’s
another area where leadership can step in. Is it going to live in legal? Is it going to live
in IT? Do you have a risk or audit function, should it be in there, maybe even not so that
you can have an internal (to the organization) entity to assess. It can be done by
committee but ultimately there needs to be an appointed point of contact that has
ownership. –PRIVACY_6
One theme that emerged from the interview data was territoriality related to internal
tensions between divisions.
Territoriality. Generating internal support to create accountability structures was difficult
because it introduced the possibility of liability. Despite presenting a unified image to the public,
practitioners stressed that companies were not monolithic and routinely experienced infighting
between departments such as IT and security, legal, marketing, HR, or privacy. Interviewees
described instances of their work getting pushed aside and treated as a suggestion rather than an
imperative because it conflicted with existing organizational realities and established workflows
and patterns of practice. One popular example of territoriality occurred when updating language
in privacy notices. Interestingly, the two departments repeatedly described as conflicting with
one another were privacy and legal. Most interviewees mentioned encountering pushback when
trying to move privacy notices away from risk mitigation frames to more user-friendly
communication. As one CPO at a major university noted,
There are some laws that have attempted to get companies to be more transparent and
actually tell people what you do. But then lawyers [points at self ironically] word the
thing and put in all their legalese, their ‘notwithstanding but foregoing’ etc. They use
their language to protect the company because that’s their privacy counsel. They’re there
to protect the company, they’re not your lawyer as an individual. –PRIVACY_2
PRIVACY_6 also recalled her experience trying to infuse privacy norms into everyday
practice without stepping on toes:
I’ve even personally come into conflict with outside counsel where I’ve drafted a privacy
notice that ticks all the boxes, did not put the organization in any kind of risk but adopted
a much more transparent method of open communication. The use of bullet points, trying
to use common terms for things, likening the information to very specific things [to]
trigger the memory of when I would have given you that information so fulfilling the
letter, spirit, and intent of the law […] the lawyers have gone in, rewritten it, and made it
much more legal-y, taking out parts that I’ve written. I actually had one law firm that
kept insisting on putting excerpts of the CCPA’s definition of personal information into
this privacy notice but the company wasn’t collecting half of the data listed there but they
still put it all in there. At some point I just bowed out and that goes back to the challenges
and land grabs between IT and legal even on outside consultancy. Your outside experts
are going to have some territorialism also. –PRIVACY_6
Making privacy policies more user-friendly was not just about translating documents into
plain language but also involved adjusting how companies perceived privacy and responded to
liability. The existing norm of notice-and-consent was built upon longstanding tenets known in
the privacy community as fair information practices (FIPs) or fair information practice principles
(FIPPs) (Swire & Kennedy-Mayo, 2020). This practice of protecting a company by shrouding
it in a cover of legal jargon represented an old institutional logic that was incompatible with new
comprehensive data protection laws and consumer expectations. According to participants, for
some employees (particularly in legal departments) removing this ‘security blanket’ constituted
the removal of their main line of defense against lawsuits, which caused tension. PRIVACY_3
explained how this shift affected his approach:
One of the huge challenges I have in counseling clients is we’re required to evaluate
what potential impact or harm could befall a consumer if their data is exposed and that’s
really hard to anticipate. How someone could use data in a way that I don’t foresee is
really hard to do. You go through the exercise, you do the best you can but you can be
wrong. Is that going to come back to haunt you if you get it wrong? If you’re a business
that makes car parts and you’ve got a lot of client data that you give to Google or some
other analytics company, they’re going to use it in ways you don’t intend even if they say
they will or not. There’s a lot of challenges for the businesses but there’s a lot of
potential harms to consumers too, so it is an approach to liability. As a society we’re
going to have to weigh the cost and benefits to each. –PRIVACY_3
This aligned with research by Waldman (2018), who pointed out shortcomings in the
traditional notice-and-choice approach: “the dominant rights-based notion of privacy, or the idea
that privacy is about giving users choice and control over the dissemination of their data, reduces
corporate privacy obligations to posting privacy policies. Any ambiguity as to how to
conceptualize privacy among those that recognize that privacy can mean different things to
different people at different times makes it difficult for practitioners on the ground to turn theory
into practice” (p. 664). For interviewees, the act of updating privacy policies represented more
than simply expanding reader comprehension; it meant charting a new path forward for their
organization. Although participants admitted it would be easier to change the wording of their
documents, they stressed the importance of building a culture around data privacy. Part of the
problem was that practitioners lacked the authority to make more comprehensive changes within
the organization that could address some of these shortcomings or gaps in the system for fear
it would leave companies vulnerable to lawsuits.
Overcoming Resistance. When asked how resistance to privacy reform could be
overcome without the backing of executive leadership, practitioners described relying on
interpersonal communication to build buy-in from the bottom up. Practitioners described
working with department heads and mid-level managers to convince them that collaboration was
in their best interest due to the changing regulatory landscape and growing threats to data. This
was best accomplished when practitioners adopted a pragmatic approach that incorporated
multiple perspectives and emphasized the utility of preemptive action so departments could
operate more efficiently and avoid the need for drastic overhauls in the future. PRIVACY_6
provided an example of how this could be achieved.
I am a humongous advocate of training because again IT can build the absolute, drop-
dead best security and do all the right things in terms of having set up databases. Legal
can write all the best policies and all you need is the business to go in there and use them
for a different purpose. If you’ve got a database field and it's a free text box, I could use
it for whatever the description is or if I’m in the business and I need to add other info and
it's a pain in the butt to go to IT and get it added or I don’t have time, I can just put other
stuff in there. But now none of the security is designed to cover what I just went rogue
with nor are any of the policies. So training, open communication, making sure that
employees have an awareness and with that awareness a buy-in that it’s important to the
organization just from an ethical perspective. –PRIVACY_6
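The free-text-field scenario PRIVACY_6 describes can be made concrete. A minimal sketch follows, with an invented field name and patterns (not drawn from any interviewee’s system), showing how an unconstrained text column invites off-policy data while a simple validator narrows what the field may hold:

```python
import re

# Hypothetical sketch of PRIVACY_6's free-text problem: an unconstrained
# column accepts anything, so employees can stash data the security design
# never anticipated. A validator constrains the field. The field name,
# format rule, and SSN pattern are invented for illustration.

ALLOWED_DESCRIPTION = re.compile(r"^[A-Za-z0-9 ,.\-]{1,200}$")
SSN_LIKE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # data the field was never meant to hold

def validate_description(value: str) -> bool:
    """Reject entries that break format rules or smuggle in identifiers."""
    return bool(ALLOWED_DESCRIPTION.match(value)) and not SSN_LIKE.search(value)

print(validate_description("Replacement part for order 1182"))  # → True
print(validate_description("cust SSN 123-45-6789 call back"))   # → False
```

The design choice mirrors the interviewee’s point: security and policy are written against the intended use of a field, so constraining inputs at the schema or application layer keeps employees from “going rogue” with data the controls were never designed to cover.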
Although a top-down approach with full executive buy-in was preferable, practitioners
were still able to exercise some agency by introducing changes through bottom-up
approaches under less than ideal conditions. Nevertheless, interviewees pointed out that unless
leadership and employees embraced privacy as a fundamental part of business, any changes were
likely to be surface-level and short-lived.
At the end of the day, it’s about what it takes for any organization, large or small, to
actually get this off the ground and get it sustained. We can bring in a consultant and we
can crack the whip and get everybody for the 3, 4, 6, 10 weeks that we’re onsite or we’re
there holding folks accountable we can get things done but in order to maintain that
momentum, there has to be that buy-in and that commitment from folks who are actually
working there to say ‘yea, I’ll carve out time from my day in order to fulfill my
responsibilities to privacy and to what the work is asking of me.’ –PRIVACY_6
Chapter 4: Cybersecurity in Practice (Study 3)
As society moves toward an increasingly digital and interconnected future, cybersecurity
has become one of the most pressing issues around the world, impacting all levels of government,
business, and vital infrastructure. Although concerns about the state of cybersecurity have always
been present, the explosion of successful attacks against individual users all the way up to global
superpowers has galvanized renewed attention and urgency across the public and private sector
to address underlying problems and vulnerabilities.
A key priority for the US Government is to raise awareness about electronic
vulnerabilities while shoring up US defenses and resilience to attack (Exec. Order No. 14028,
2021). The Biden Administration’s approach to cybersecurity represents a new direction for US
policy, highlighting the need for increased public and private sector collaboration to protect
critical infrastructure (National Security Memorandum, 2021). A key aspect of this plan is to
help private industry strengthen its cyber-defenses since “much of our domestic critical
infrastructure is owned and operated by the private sector, and those private sector companies
make their own determination regarding cybersecurity investments” (EO 14028 Fact Sheet, 2021,
para 2).
Traditionally, the majority of cybersecurity research and strategy has focused on securing
technical infrastructure and preventing unauthorized access to information (NIST, 2021). Human
error has been consistently cited as the top security threat to business for years (IBM Security
Services, 2014; van Zadelhoff, 2016), with up to 85% of breaches in 2020 involving a human
element (Bassett, Hylender, Langlois, Pinto, & Widup, 2021). One of the fastest growing areas
of research is human factors in cybersecurity. This body of research focuses primarily on low-
level individual behaviors such as promoting cyber-hygiene and teaching users to avoid phishing
scams, loss of hardware, etc. However, far less attention has been paid to higher-level human
factors in organizations such as the role of management, leadership decision-making, and the
influence of organizational culture on how security gets prioritized in everyday operations.
Efforts to understand the larger culture and collective practices that constitute, enable, and
reinforce current approaches to cybersecurity are still few and far between.
This chapter sought to expand the study of human factors within cybersecurity to include
the role of group-level dynamics and high-level actor decision-making behavior. In particular,
this research was interested in studying the routine, taken-for-granted assumptions and collective
practices within organizations that shaped how security was implemented in
practice. It was especially concerned with understanding industry norms, values, incentive
models, and patterns of behavior that often were overlooked when studying humans and
cybersecurity.
From a communication perspective, a key piece of the puzzle missing
from the field has been an emphasis on group-level dynamics and interactions. One reason why
cyberattacks continue to be so challenging has been the lack of understanding about how
cybersecurity gets implemented in practice. This generation of attackers has been so effective not
only due to their ability to infiltrate technical systems but also their understanding of weaknesses
across the cybersecurity landscape that have allowed them to exploit well-established human-
centered vulnerabilities within organizations (Dudley & Golden, 2021). A prime example of this
would be the Colonial Pipeline hack wherein an outside audit found evidence of “atrocious”
information management practices and “a patchwork of poorly connected and secured systems”
up to three years before the actual attack (qtd. In Bajak, 2021). No amount of technology, audits,
or cyber hygiene training could ever make up for collective inaction like the decision not to
address system-wide vulnerabilities in a timely manner. Investing in tools like AI, advanced
detection systems, and cyber-insurance can offer organizations some reassurance by
putting a Band-Aid on the problem, but no tool in the world can fix an organization’s
cybersecurity if high-level management is not on board.
Despite prominent discussions taking place in government and across the private sector
about how to improve cybersecurity, there is often a key perspective missing: that of
cybersecurity practitioners. These are the people with in-depth knowledge and first-hand
experience dealing with long-term, pervasive problems facing the field. Not only can they
provide insight about how organizations are currently protecting data but they can also speak to
the organizations’, and in turn the industry’s, capacity and capability for change. Unfortunately,
this expertise is often not widely shared outside the security community or, if it is, it does not
receive the attention it deserves. By documenting the challenges faced by cybersecurity
practitioners on the ground, this pilot study can begin to facilitate greater understanding,
communication, and collaboration between the cybersecurity community and key stakeholders
such as policymakers, business leaders, and the public. The goal was to use this research as a
vehicle to provide those outside the security community with a more holistic view of the
cybersecurity landscape.
Using semi-structured interviews with nine cybersecurity practitioners, this pilot study
explored some of the challenges faced by cybersecurity professionals as it sought to understand 1)
why these problems existed and 2) what could be done about them. This chapter first sets the
stage by discussing how this study’s approach builds on existing human factors in cybersecurity
research. Next, it provides a brief overview of methods followed by results. The results section
suggested that improving cybersecurity within organizations required overcoming strongly held
beliefs and practices that were misaligned with enhanced cybersecurity postures. Thematic
analysis identified problematic incentive models as a fundamental misalignment that occurred
within organizations. In particular, interviewees challenged the assumption that organizations
cared enough about cybersecurity to prioritize it in everyday practice. They argued that decisions
to place business interests ahead of security were not isolated, one-off incidents but regular
occurrences throughout the private sector, pointing out that when given a choice many
companies would opt for weaker products, policies, and practices. While the increase in high-
profile cybersecurity events in the news has generated greater awareness among company
leadership about possible threats, interviewees noted that this concern is unlikely to translate into
better overall security because most company leaders do not have a good understanding of
cybersecurity or risk factors within their organizations.
Literature Review
Human Factors in Cybersecurity
Research on human factors in cybersecurity has been primarily concerned with studying
the individual-level unit of analysis, which prioritizes user-centered perspectives (Jeong,
Mihelcic, Oliver, & Rudolph, 2019; Poehlmann, Caramancion, Tatar, Li, Barati, & Merz, 2021;
Sasse & Rashid, 2021). In general, human factors research attempts to understand how humans
interact with their information security environments including networks and information
systems (Nobles, 2018; Rahman, Rohan, Pal, & Kanthamanon, 2021). Sasse & Rashid (2021)
note that security failures are primarily a result of a lack of usability and inappropriate fit.
“Security measures are not adopted because humans are treated as components whose behaviour
can be specified through security policies, and controlled through security mechanisms and
sanctions. But the fault does not lie primarily with the users, as suggested by the oft-used phrase
that humans are the ‘weakest link’, but in ignoring the requirements […] that security needs to be
usable and acceptable to be effective” (p. 3). According to Gale (2021), employees are often
identified as the ‘weak link’ in organizational cybersecurity because convenience is almost
always chosen over security. However, he argues the burden of responsibility weighs
disproportionately heavily on employees and adds to a chorus of experts calling on companies to
take a more proactive role in managing employee instincts as part of defensive structures. One
area of this research is known as organizational cybersecurity.
Organizational Cybersecurity
Organizational cybersecurity draws on a number of literatures including those in
psychology and management to emphasize the importance of understanding organizational
responses, employee perceptions, and experiences related to cybersecurity (Dalal, Howard,
Bennett, Posey, Zaccaro, & Brummel, 2022; Dreibelbis, Martin, Coovert, & Dorsey, 2018;
Jeyaraj & Zadeh, 2020; Tejay & Klein, 2021). The goal is to help secure an organization’s data
and information systems by understanding and then guiding user and organizational behavior
toward desired outcomes (Collins & Hinds, 2021; Jeyaraj & Zadeh, 2021; Kalhoro, Rehman,
Ponnusamy, & Shaikh, 2021). The majority of attention focuses on how organizations can reduce
risk by designing more usable security systems and developing interventions including
models/frameworks (Clark, Espinosa, & DeLone, 2020; Renaud & Ophoff, 2021), programs
(Reagin & Gentry, 2018), and trainings (Poehlmann et al., 2021) to promote positive behaviors
while simultaneously discouraging actions that might destabilize networks and systems (Huang
& Pearlson, 2019; Sasse & Rashid, 2021).
Cybersecurity Culture
Cybersecurity culture is also discussed in relation to how management can develop wider
awareness and compliance throughout the organization. A key aspect of cybersecurity culture
and risk governance is the communication of risk to the workforce. This is done by stressing
workforce education, training, confidence, and involvement (Burnap, 2021). According to
Nobles (2018), effective information security culture should be treated as a vital part of business
strategy. ‘Enlightened security cultures’ rely on policies that value and support distributed
responsibility for security by empowering individual employees to make choices that infuse
security into everyday behaviors. These approaches are achieved by developing programs that
reinforce cyber principles and practices to facilitate employee adherence to security plans and
behaviors (Reegård, Blackett, & Katta, 2019).
However, despite years of research many companies are deficient when it comes to
cyber-readiness and cyber-resilience. According to McKinsey & Company, approximately 70%
of organizations surveyed have yet to achieve advanced levels of security management required
for the current business environment (Eiden, Kaplan, Kazimierski, Lewis, & Telford, 2021). It is
important to point out that despite the deluge of security attacks occurring every day, this
environment is neither surprising nor unexpected. While specific techniques used to infiltrate
networks and organizations might be novel, many structural vulnerabilities arise from continued
use of legacy systems and other long-standing problem areas (Charette, 2020). For the most part,
older technologies are not able to withstand the current level or sophistication of attacks, thus
forcing practitioners to retrofit or adapt legacy systems. This reactive approach might work in the
short term but does little to address the overarching problems and threats that continue to plague
the industry and leave companies susceptible to new attacks over time.
Building on human factors in cybersecurity literature, this research attempts to
understand why organizational cyber-maturity continues to lag despite concerted efforts made by
companies and researchers to improve cybersecurity. This chapter took a closer look at how
cybersecurity was practiced on the ground within organizations from the perspective of
cybersecurity practitioners. The goal was to expand the study of humans in cybersecurity to
include higher-level actors and perspectives that have been understudied in the literature. By
understanding factors that impeded the development and practice of cybersecurity within
organizations, this study added nuance and context to ongoing discussions about how to meet the
demands of the current cybersecurity environment. The research questions are as follows:
RQ1: What challenges do cybersecurity practitioners encounter while securing
organizations?
RQ2: Why do these challenges exist?
RQ3: How should these challenges be addressed?
Method
Data Collection
Interviews
Although privacy and cybersecurity often overlap, they are distinct fields of practice with
security seen as much more technical and less concerned with abstract theory compared to
privacy. Each discipline had its own history, perspectives, approaches, dilemmas, conflicts, and
cultures that were specific to the domain and could not be understood as a conglomerate. This
research recognized this divergence and set out to explore the unique issues and narratives within
cybersecurity that were separate from privacy.
This pilot study utilized a subset of interviews from the main corpus of data. Data was
drawn from the nine interviews conducted between 2020 and 2022 with information security
practitioners from across cybersecurity domains, including in-house information security
(e.g. chief information security officers), a business executive responsible for allocating
resources for security activities (i.e. chief information officer), outside security consultants,
security researchers (e.g. penetration testers, auditors, and white-hat hackers), a privacy and
cybersecurity attorney, and a cybersecurity reporter. Interviews were chosen for this exploratory
study because unlike questionnaires and surveys, interviews allow researchers to delve into
individual experiences, beliefs, and attitudes (King, 1994). Questions from the in-depth, semi-
structured interviews probed individual practitioners’ knowledge, motivations, perceptions, and
experiences about 1) current approaches to cybersecurity, 2) challenges encountered protecting
organizations and data, 3) why these problems existed, and 4) what could be done about them. The
primary focus was to get a better sense of how current cybersecurity systems worked by
highlighting the experiences of practitioners. Interviewee demographics specific to Study 3
including cybersecurity domain, employment sector, job title, and years of experience can be
found in Appendix 2.
Qualitative Analysis
Interviews were coded using selective open coding and analyzed using thematic analysis.
The first phase involved identifying emergent patterns, while an iterative, inductive approach was
used to identify recurring themes and distinguish patterns within transcripts (Charmaz, 2006;
Strauss & Corbin, 1990; Murphy, Dingwall, Greatbatch, Parker, & Watson, 1998). See Study 1
Qualitative Analysis section for more in-depth information about qualitative analysis methods. A
conceptual map of themes can be found in Figure 3.
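The iterative coding step can be illustrated with a minimal tally of code frequencies across transcripts. The codes and interviewee assignments below are hypothetical stand-ins, not the study’s actual codebook:

```python
from collections import Counter

# Minimal illustration of tallying open codes across interview segments
# to surface recurring themes. Codes and assignments are invented for
# illustration and do not reflect the study's actual codebook.

coded_segments = [
    ("DATA_2", "incentive_misalignment"),
    ("DATA_2", "legacy_systems"),
    ("DATA_9", "incentive_misalignment"),
    ("DATA_9", "third_party_risk"),
    ("DATA_5", "incentive_misalignment"),
]

code_counts = Counter(code for _, code in coded_segments)
for code, n in code_counts.most_common():
    print(code, n)
```

In practice, frequency is only a starting point; the thematic analysis described above also weighs how codes cluster and relate across interviewees before promoting a pattern to a theme.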
Figure 3
Study 3 Themes
Results and Discussion
Results of the pilot study highlighted institutional resistance to cybersecurity practitioners
as they attempted to better secure their organization’s networks and information. Organizational
culture, including collective attitudes and patterns of practice, reinforced traditional norms and
values that undermined efforts to improve cybersecurity. Interviewees discussed
misalignment as a fundamental problem plaguing the field of organizational cybersecurity,
leading to problematic incentive models that treated cybersecurity as secondary to business.
Findings suggested that although high-level leadership purportedly cared about
cybersecurity, the decision to prioritize business over cybersecurity was not limited to a handful
of organizations but was a pervasive and normalized expectation in the field. Interviewees cited
the industry’s reluctance to increase user friction as a major barrier to improving security, which
severely limited the adoption of stronger policies and practices. Current patterns of practice
discouraged additional verification of third-party companies such as vendors and even security
testing companies. The inability to independently test the security standards of third-party
vendors forced practitioners to make decisions about granting access to systems based on very
little information, usually assurances from the third party itself. Participants pointed out that the
lack of accountability in these relationships could often leave their networks and organizations
exposed and vulnerable to attack.
A major recurring theme was that high levels of cybersecurity were not
required, and often not even desirable, when running a business. Every interviewee was careful
to specify that this was not the case for every company; there were indeed some organizations
that implemented security effectively. However, they also stressed that, in their experience, this
was not true of the majority of the field, and many companies could and did get away with
doing far less than bare minimum security standards required. Participants were unanimous in
their opinion that real, meaningful change would only happen when catastrophic security events
occurred that were too big to ignore. This was partly due to a lack of accountability and
repercussions throughout the field that allowed inaction by companies to continue over long
periods of time.
While cybersecurity awareness has been increasing among high-level decision-makers,
interviewees stated that the new challenge had become helping leadership understand and adapt
to the threat landscape. This required organizations and top-level management to evolve their
understandings about what it meant to be secure and what constituted their greatest risks within
the current environment. In terms of organization, this section first discusses the current state of
cybersecurity followed by misalignment in incentive models. Misalignment was broken down
into three sections including 1) treatment of cybersecurity as a secondary priority to business, 2)
the role of user friction, and 3) the inability to verify and hold third parties accountable.
Current State of Cybersecurity
When asked about the state of cybersecurity, every interviewee noted that conventional
understandings about data and what it meant to be secure were a thing of the past. Years of data
proliferation and lax standards had fundamentally changed the data landscape. Traditional
assumptions about data were no longer applicable as data had been shared many times over and
once it was out, it was almost impossible to retrieve. Housing data securely was extremely difficult, and it
was only a matter of time before a breach or compromise took place. According to participants,
practicing cybersecurity in today’s world meant existing in a state of insecurity
uncertainty. It was no longer a question of if an organization would get hacked, but when and
how. The best thing to do was prepare and try to mitigate and manage threats but nevertheless
there would be attacks coming. DATA_2, a chief information officer with more than thirty years
experience in industry and higher education explained this logic:
The bottom line is, it’ll never be perfect, you don’t have to be perfect, you just have to be
faster than the slow guy. […] If you keep your security just a little better than the average,
it’s just easier for the bad guys to go after the other data but it’s not secure by any stretch.
If they want it, they can get it. It’s just that there’s easier pickings out there. Make sure
when you’re running from the bear, you’re running with somebody who’s slower than
you. –DATA_2
This mentality was incredibly difficult for leadership to come to terms with; however,
interviewees stressed that management perceptions around data must evolve if there was to be any
improvement in cybersecurity. An offensive security specialist with more than a decade of
experience talked in-depth about some of these complexities and the mentality needed to move
forward in the current environment.
There’s a reason we haven’t solved this problem [of computer security] and it’s because
there are significant roadblocks that prevent it from being effectively solved for a lot of
organizations. Even the ones who do it right, they still make mistakes. That’s the hardest
part, you can do everything right- and that’s why GDPR doesn’t just fine everyone who
gets hacked- because you can do everything right and you can have the best of intentions
and you can still get hosed and it happens. It happens to [a lot of] organizations […]
Housing data, housing data securely, and keeping a good inventory on data- that
gets really complex really quickly especially when you’re dealing with lots and lots of
different systems that all have different designs, intents, and purposes. It’s a hard
problem and as much as I’m sitting here bashing these companies, I do get that there’s
difficulty and complexity involved but you have to be willing to actually try and solve a
problem in order to solve it. If you have no interest in actually solving it, it’s a non-
starter. –DATA_9
Cybersecurity Not a Prerequisite for Business
When asked how cybersecurity had reached this point, there was full consensus among
participants that good cybersecurity was not a prerequisite for running a business. Making
money, however, was always a requirement regardless of industry. While there were obviously
companies that bucked the trend, for the most part interviewees stressed that many companies
across the private sector had either bare-minimum or below-minimum security standards. Despite
promises and commitments to improve, there simply was not enough incentive to change. There
was unanimous agreement that although some techniques were new, the majority of exploits
stemmed from long-standing and well-known problem areas. DATA_9 explained:
Once again, I’m super cynical about a lot of this stuff but that’s just because I’ve been
doing this for a decade. These aren’t new problems. These organizations that are getting
hacked, it’s not like these people are coming up with new, creative strategies a lot of the
time. A lot of this stuff are the same. It’s the same shit people were talking about in 2006,
it’s the same shit people were talking about in the year 2000. It’s the same core logic
issues, it’s just that they’re being adapted to new frameworks, new languages, and new
types of infrastructure. But the core logic and the core vulnerabilities have not changed
considerably. –DATA_9
Lack of Repercussions and Accountability. According to interviewees, one reason
problems had been allowed to continue largely unabated for so long was a lack of interest, but
also a lack of repercussions and accountability across the cybersecurity field. With so
many companies continuing to get away with doing less than the bare minimum, companies had
little incentive to improve their security standards beyond their own internal perceptions of
responsibility. Participants pointed out that experts had warned for years that Equifax was a ticking
time bomb due to the amount of highly personal data collected combined with very lax security
standards, yet no major action was taken. All three security researchers talked at length about
how frustrating this type of inaction could be, especially when companies repeatedly failed to
fix vulnerabilities or improve their defenses. For example, DATA_11, a threat research engineer who has worked
for a multinational technology conglomerate for the last 17 years, stated:
What my experience is that there's a tremendous amount of inertia that people have
where an object at rest will stay at rest kind of thing. They'll nod their heads and then
they'll go home and it's kind of like in one ear and out the other. Like I was saying before,
now that you've got ransomware attackers that are out there actually encrypting people's
machines it’s ‘Oh, maybe I should be thinking about making backups.’ –DATA_11
DATA_10, a veteran hacker and security consultant with over thirty years’ experience in
computer security, also detailed his experiences of repeatedly being ignored and pushed aside
when it came to recommending improvements.
I did security reviews and incident response for twenty years for different companies
helping them respond to hacker attacks or looking at the security defense plans. What I
was amazed by was I would tell them this is what you need to fix: you need to patch
[issues with the programming language] Java. You’ve been compromised because you
didn’t patch Java three times, an advanced persistent threat got in here etc. I would tell
them that, everybody would nod and then I’d come back in six months, they didn’t patch
it. Come back in a year, they didn’t patch it. Let me say, this was every single company
but one over the years […]
It really bothered me so much and I’m surprised I didn’t get fired because I was
like ‘you people aren’t fixing this, I’m going to go to your CEO and your board of
directors and I’m going to tell them I told you what to fix and you’re still not fixing it and
you’re being broken into again.’ I said ‘I don’t understand why you’re not fixing it.’ I’d
get mad, I’d get angry. They still wouldn’t fix it. I actually got walked out of a couple
boardrooms because I was starting to get so –I just didn’t understand. –DATA_10
This mentality seemed common up and down the organizational hierarchies and within
the cybersecurity field itself. DATA_9 described the demoralizing experience of presenting detailed
reports at board meetings, only to realize the company was more interested in the reputation of
the testing firm than in the actual results. Even within the security testing space, the goal was not
necessarily to do good work but to make money.
What they want to know is which company logo is on the front page. Most of those
reports, the report could be 400 pages, there is not a single person at that board meeting
that is going to read past page 4. Ever. What they want is to ensure that [they’ve covered
their backs]… no one ever got fired for hiring IBM. No one ever got fired for hiring PwC,
Ernst & Young, Deloitte, Accenture, or any of those companies. It doesn’t mean they’re
good at what they do, it just means you know you’re not going to get fired for hiring them
[…] I guess I could be considered to be pretty cynical on a lot of this stuff but I’m just
going off of what I’ve seen in industry and what I’ve seen of this space. It seems like the
motives of these businesses that are doing this security work predominately isn’t to
actually deliver good work, it’s to make money. –DATA_9
As these experiences showed, companies could routinely go through the motions and look
as if they were engaging in best practices without actually doing much work. Participants
argued that the industry norm of demonstrating outward compliance without much expectation of
follow-through severely undermined the effectiveness of vital accountability measures. In a
sense, security audits and penetration tests could become “performative” as a way for top
management to signal to outsiders that they had checked all the boxes and were compliant with
industry standards. However, it did not mean they had necessarily made any of the recommended
changes to ensure internal security (which was typically expensive and highly disruptive to
business).
Barriers to Prioritizing Cybersecurity in Everyday Practice
While on the surface it might have seemed as if there was a lot of animosity between in-
house and outside security teams, in reality everybody seemed upset about the same thing: the
inability of organizations to prioritize cybersecurity in everyday practice. Frustration about the
lack of progress in cybersecurity as a whole was felt across interviews and domains. In-house
experts were keenly aware of and concerned about these problems but were largely resigned
about the prospects of fundamental change, given the pressures they faced from their
organizations and the industry at large. While legislation was praised as a step in the right
direction, most did not expect it to have a significant impact on day-to-day operations because it
would be implemented too slowly and inevitably lag behind the field. They explained that because
most legislation tends to be reactive and revolve around codifying basic minimum standards, the
organizations that valued cybersecurity enough to make changes voluntarily were probably
already operating well above baseline regulations. The organizations that would be affected were
the ones who would need more help and incentive to change than a single law or mandate could
provide. DATA_2 explained that while legislation could provide the ability to prosecute people
after the fact, it was still largely reactive in nature, “I don’t think [legislation] will stop anything.
I think what it will do is allow you to punish people who get caught doing it. But there’s so much
money in it, they’re going to do it anyway. And so much of this is not done here, it’s done
offshore. So what are you going to do when someone in the Ukraine or Asia is doing this?”
For all participants, a more pressing issue than legislation was getting leadership and
high-level management on board to make the changes necessary to better secure their
organizations. Every interviewee was adamant that meaningful change would only occur when
“apocalyptic” security events severely disrupted people’s ability to go about their daily lives.
Even so, for the most part, interviewees were hopeful that the threat of fines and reputational
damage related to poor cybersecurity practices would motivate companies to take security more seriously.
However, one participant stood in stark contrast to the rest, suggesting that maybe the
cybersecurity field had just been approaching this incorrectly. DATA_10 explained:
The problem is that sometimes security people, like myself, maybe we care too much. We
think that ‘boy you have to protect your company because if this information gets
compromised then it’s really going to harm this company. There are many companies
that got fined up to hundreds of millions of dollars, CISOs got fired, etc. But the vast
majority of companies that get a really bad compromise like Equifax or whoever it is,
within a year their stock price is above where it was a year ago. The reality is that no
matter how bad things are, the vast majority of companies survive. So perhaps it’s me
and all the other computer security people that don’t understand that the security isn’t
actually that important. –DATA_10
As difficult as it is to accept, perhaps DATA_10 is right. Given the current state of
cybersecurity and the numerous problematic incentive models, how realistic was it to expect
organizations to willingly invest in and overhaul their operations when there were so few benefits
and repercussions? Interviewees spoke at length about some of the incredibly difficult
considerations, obligations, interests, incentives, and constraints that were not often
acknowledged or discussed in high-level and mainstream conversations.
Problematic Incentive Models: Cybersecurity as Secondary to Business
Participants stressed that in order to understand cybersecurity as a whole it was important
to understand the multiple interests and perspectives at play. A cornerstone of many
organizational realities in business was the belief in the free market (i.e., the market will regulate itself).
By this logic, an organization would have a natural interest in investing in cybersecurity and
maintaining strong data protection because companies with weak or inferior security practices
would be at a disadvantage, at least in theory, against companies with more robust protections, as
the latter could better prepare for, detect, defend against, and recover from attacks. However, every
interviewee challenged this assumption and argued that from a business standpoint it was not
necessarily in a company’s best interest to prioritize cybersecurity in day-to-day operations.
Participants explained that even though companies said they cared deeply about cybersecurity,
their actions often did not match their words since security was routinely undermined for the
sake of business. The first example that came up repeatedly across multiple interviews was the
role of user friction.
The Role of User Friction
According to participants, a goal of every organization was to reduce user friction
whenever possible. In this case, interviewees referred to user friction as anything that interfered
with or hindered a user’s experience as they interacted with a product, service, or process. This
steadfast adherence to streamlining the user experience was often at odds with cybersecurity best
practices. Three interviewees cited the credit card industry as an example of security protections
being “decimated” by the need for convenience. DATA_10 explained the decision by credit card
companies in the United States to disable key security features when rolling out Chip-and-PIN
credit cards.
Most of our credit cards are called Chip-and-PIN, which means it has the chip that
securely protects your authentication secrets and you’re supposed to put a PIN in to
reveal those secrets to a merchant or purchasing device. Well throughout the rest of the
world, you have to put your card in with a chip plus put your PIN to use them. But in
America the credit card industry decided that they were not going to require the PIN, that
you only had to put in the chip or the PIN in some instances but very rarely both. Without
it having both, you don’t get nearly as much protection but they were worried about the
number of people who would mess that up and it would make them mad and those people
wouldn’t use those cards. So the entire industry decided to somewhat get rid of three-
fourths of the protection of it as it’s implemented in America and nobody’s complained.
Nobody’s not using the cards, nobody’s changing their purchasing behavior. –DATA_10
This example of how long-standing values and problematic incentive models undermine
cybersecurity is in line with existing work by scholars like Wolff (2016), who points out the
complex sets of interests and motivations at play. The two interviewees with hacking
backgrounds argued that credit card fraud could be significantly reduced if there was a collective
decision to address it in a meaningful way. However, they noted that reducing credit card fraud
was not a main priority for many companies. As an ethical hacker ranked in the top 15 of
HackerOne’s All-Time Reputation list, DATA_9 explained, “[Credit card and bank fraud] have
existed forever. Do you think fraud is difficult? Do you think people who commit fraud are like,
‘Oh this is a really hard thing for me to do.’ Hell no, it’s the path of least resistance. It’s as easy
as breathing.”
When asked why companies did not take more steps to make credit card fraud harder to
commit, DATA_10 summed up some of the pushback. He explained that credit card companies
were more interested in making sure customers could keep spending money than stopping fraud.
Everybody thinks that [credit card companies] are trying to make credit card crime
harder to commit but that is not what’s occurring. The money’s being spent to try to get
us from having our legitimate transactions blocked. That’s the fear […] the biggest
selling point for a company that tries to prevent credit card crime isn’t that they better
detect credit card crime, it’s that they’re better at not stopping legitimate transactions.
Because if you and I get one, two, or three transactions blocked in a two or three-year
period, that will annoy us so much that we might use somebody else’s product. So credit
card companies that are spending money aren’t trying to make crime less, they’re trying
to stop us from being stopped in our transactions more often. […]
Besides you having to go onto Amazon and update your credit card, it really has
very little impact on you. It isn’t like it messes up your credit record. There used to be
these horror stories ten years ago where people’s lives were ruined and I’m sure it still
kind of happens every now and then but for the most part it’s just a ten-minute
inconvenience for people. –DATA_10
This strategy works because companies have been able to calculate, control, and mitigate
fallout from credit card fraud. But how sustainable would this approach be for different types of
technology and less temporary forms of data? The reluctance, and in many cases resistance, to
adding friction to any part of the process posed a huge problem for practitioners trying to
strengthen organizational security postures and programs. Even when new security functionality
was introduced, it was not uncommon for companies to implement only a select few protections
rather than the full range of features possible. While this can make sense in the short term, it can
also pose major problems over time that leave gaps in coverage. DATA_10 gave an example of
how companies can more securely store permanent biometric identifiers like fingerprints by
making them less valuable and harder to steal.
For every fingerprint, the picture gets converted to the points of divergence and it looks
like a star constellation. A really smart biometric company could then turn that star
constellation into a matrix grid –some grid that represents this dot is at A2, this dot is at
D3, this dot’s at F4- and they can convert those points where the dots are into some type
of alphanumeric representation. They can even hash that so that is what gets stored in the
database. If somebody steals the database, the fingerprints aren’t there. The star
constellation representation of the fingerprint is not there. All they get is this number or a
hashed out result that really is of no value. So there are things that biometric collecting
vendors should do to make that stolen database a lot less valuable to an attacker but 99%
of them don’t do it. –DATA_10
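The storage scheme DATA_10 describes can be sketched in a few lines of Python. This is an illustrative sketch only: the function names and grid parameters are invented for this example, and it captures only the storage principle he outlines (grid encoding, then a hash in the database), not a production-grade biometric template-protection scheme.

```python
import hashlib
import os

def encode_minutiae(points, cell_size=10):
    """Snap (x, y) minutiae points onto a coarse grid and build a
    canonical string such as 'A2-D3-F4' -- the 'star constellation'
    reduced to grid cells, as in DATA_10's example."""
    cells = sorted(
        (chr(ord('A') + x // cell_size), y // cell_size)
        for x, y in points
    )
    return "-".join(f"{col}{row}" for col, row in cells)

def hash_template(points, salt=None):
    """Hash the grid encoding so the database stores neither the
    fingerprint image nor the constellation. A per-record salt
    (hypothetical design choice here) blocks precomputed lookups."""
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.sha256(salt + encode_minutiae(points).encode()).hexdigest()
    return salt, digest

# A stolen database row now holds only (salt, digest), which is
# "of no value" to an attacker in the sense DATA_10 describes.
salt, stored = hash_template([(5, 21), (32, 33), (57, 40)])
```

In practice, exact hashing is complicated by the fact that two scans of the same finger rarely produce identical points, which is why real systems lean on noise-tolerant constructions (e.g., fuzzy extractors or cancelable-biometric transforms) rather than a plain hash.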
Participants were very clear that security in itself was not fundamentally oppositional to
business. Better security could actually help companies’ bottom line by opening up more
opportunities to take advantage of data in safe and secure ways. However, in order for this to
happen, leadership had to be willing to invest in policies, procedures, and governance structures
that almost always required at least some level of disruption to business operations. While legacy
systems were no doubt an obstacle to improving cybersecurity, over half of the interviewees
pointed out that the problem was not necessarily only technical in nature. More secure solutions
already existed; they just did not exist without adding friction. DATA_10 explained:
We can truly make things more secure -we can make the whole internet more secure, we
can make transactions more secure, we can prevent viruses, worms, Trojans, spam,
phishing, all of that fairly easily. But there is no will for the people to want to do that. As
much as they complain, they really aren’t complaining that much. Because if they really
cared, it would change overnight. –DATA_10
The majority of interviewees expressed similar skepticism about the effectiveness of new
security developments being rolled out into the mainstream. This was because companies were
not simultaneously implementing the associated safeguards and controls needed to keep new
identifiers and processes safe. DATA_9 described it as “layering good ideas on top of foundations of garbage.”
He stated, “the unfortunate reality is [that these are] good plans if you can actually do the other
side of it. I know exactly what will happen because it’s what happens every single time. We’ll
have a great idea, we’ll half implement it, and then we’ll spend the next 20 years playing catch-
up trying to figure out how to do it more effectively.”
Multi-Factor Authentication. The best example that came up organically across
numerous interviews was multi-factor authentication (MFA). Participants explained that while
the idea was good in theory, two major problems arose: 1) MFA only addressed one aspect of a
larger problem, and 2) MFA was commonly misunderstood as a cure-all for hacking. Although
MFA has become much more common in recent years, critical weaknesses in
its deployment were discovered as early as the 1990s. DATA_9 provided context for how and
why MFA has struggled to achieve its potential due to the lack of additional safeguards needed
to keep new information and processes safe.
When two-factor [authentication] was shiny and new, every genius in the security field
was like, ‘two-factor, it’ll change your life.’ You can get a code to your phone, which
means you have two passwords but it didn’t pan out like that. It worked for a couple of
years and then hackers, the majority of whom were teenagers not to mention actual
threat actors like overseas intelligence groups, realized they could essentially either
intercept those 2FA (two-factor authentication) codes or –which we have seen a lot more
in the last couple years– they could just hack the actual service provider. All of those
service providers have been owned for that reason.
I can tell you comfortably before I was working computer security, I knew lots of
instances where people I know were trying to target a specific individual and they’d go
hack Bell Canada or they’d hack AT&T because it was easier to hack AT&T than hack
the person. So it was like ‘I can hack AT&T and then I can actually just go and port this
number to a new phone myself and then I can just receive their 2FA codes and do a
password reset.’ That’s an example where the logic was good –two passwords, SMS,
data-bound, it’s out-of-band– but the reality of the situation was it just didn’t pan out like
that. –DATA_9
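The contrast DATA_9 draws between SMS-delivered codes and the underlying one-time-code logic can be illustrated with a minimal time-based one-time password (TOTP) generator in the style of RFC 6238. This is a sketch for illustration, not a vetted implementation; the point is that an app-generated code is derived locally from a shared secret, so there is no SMS message for an attacker to intercept or redirect via a carrier compromise or SIM port-out.

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, timestamp=None, step=30, digits=6):
    """Derive a time-based one-time code (RFC 6238 style).
    The code is computed on the device from a shared secret and the
    current 30-second window; nothing travels over SMS."""
    counter = int(timestamp if timestamp is not None else time.time()) // step
    msg = struct.pack(">Q", counter)            # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 10 ** digits:0{digits}d}"
```

Note that this only changes the delivery channel: the provider holding the shared secrets can still be hacked, and it does nothing against a Trojan running on the user’s own workstation.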
Contributing to this problem was a general lack of understanding about how MFA fit into
the existing ecosystem. DATA_10 spoke about how this had contributed to misalignment in
strategies, which made it easier for attackers to breach systems.
I can hack any MFA solution at least four or six ways, at least, and most of them ten to
twelve ways. But there is this belief among people that when they start using MFA that all
of the sudden they’re a lot more protected than they used to be. Now that is the case for
some scenarios but it really doesn’t protect you in most of the hacking scenarios. The
vast majority of hacking is tricking somebody into running a Trojan on their workstation
and MFA doesn’t help with that at all. In some instances MFA can actually make it
easier. –DATA_10
Despite their strong opinions, the outside security consultants acknowledged that while it
was easy for them to point fingers and rattle off long lists of problems that organizations needed
to fix, in-house security was incredibly difficult and involved a completely different set of
challenges and pressures from which they were largely immune.
In-House Security Perspectives. Up to this point, the discussion has primarily focused
on perspectives from offensive security specialists who take an adversarial approach to helping
protect organizations from attack (usually from outside the organization). While their
contributions highlight important weaknesses in the cybersecurity environment, it was also
important to hear from people on the inside who specialized in defensive or more conventional
approaches to cybersecurity.
Of the in-house security experts, three talked about how they attempted to balance
security against business. In general, they described the need to reduce user friction combined
with partial implementation of functionality as a major disadvantage, especially when trying to
gain the trust and buy-in of users. One interviewee, who served as a director of Governance and
Risk Management at a university and had over ten years’ experience in computer security, explained
her perception of the situation:
One of the hardest things to do in our area is you don’t necessarily want to impede the
business. You want to be able to enable things to grow and flourish. Sometimes people
are so unaware and they think ‘security by obscurity. Nobody even knows who we are so
we’re good’ but that’s actually not the case. –DATA_4
As mentioned in this quotation, there are many different ways companies can go about
practicing security and it is not uncommon for security to take a backseat to convenience and
other priorities. However, if cybersecurity is going to improve across the private sector, dominant
organizational realities that subscribe to ‘security by obscurity’ mentalities need to be laid to rest.
One chief information security officer (CISO), DATA_3, with over twenty years’ experience,
described his approach, which involved helping those around him better understand the risks
associated with particular actions: “As a security
team and a risk team, we’re not here to say no. We’re here to say ‘here are the tradeoffs.’ Now
ultimately I can say no. I don’t say no a lot as long as the people that I’m speaking with, the ones
who are willing to take the risk, 1) understand it and 2) are at the right level to accept that risk or
not.”
In contrast to the outside security consultants, the in-house security interviewees were
generally expected to have a broader perspective on the data protection landscape. They
spoke at length about the challenges of trying to protect users’ data when so much personal
information had already been exposed. DATA_2 described the magnitude of this problem and
how that impacted the ways he could protect his organization.
The core thing I would tell you here is when you talk about securing data, the fallacy is
that the data exists someplace that you can secure it. In fact, your health data is probably
in a 100,000 different places right now. How do you secure that? You don’t even know
who has it. That doesn’t mean we shouldn’t try. It doesn’t mean it’s not important but it’s
almost like this horse has already gone out of the barn. –DATA_2
Because data no longer existed in one location, the ability to link unrelated pieces of data
and reconstruct profiles of people made these particular practitioners very aware of their
responsibility to keep track of and limit the sharing of personal information whenever possible.4
At the same time, interviewees were candid about the pressures they faced from multiple
stakeholders to share large amounts of data both inside and outside their organizations. They
pointed out that despite steps taken by their organization to mitigate risk, there was always
pressure to conform to expectations and norms of the field. DATA_3 explained why it continues
to be so difficult for companies to prioritize cybersecurity despite a desire to do so:
Business is more the driver than the risk conversation. […] I think it is changing,
however, what ends up happening is two things. One is the money is so great it’s hard to
turn your head. The other thing is when there are crises like Covid-19. What takes
priority and what takes the backseat? Yes Covid-19 is extraordinary but organizations
experience small crises all the time and they make tradeoffs and sometimes their
tradeoffs will trump even the best risk management program that’s out there. So yes it’s
changing, but it hasn’t changed enough. –DATA_3
As the examples above highlighted, there were many reasons why it was extremely
difficult for organizations to prioritize security. Not only were there enormous financial
incentives for companies to allow data to be accessed and shared more freely, there were also
many layers of institutional barriers and resistance that had to be addressed. DATA_11 believed it
ultimately came down to the priorities of an organization and whether management was willing
to accept the tradeoffs. He summed up the situation:
4 It should be noted that the high level of concern expressed by these interviewees may
or may not be unique to the field. Since they worked in higher education, it was possible that this
sector might have had a slightly different outlook on and relationship with users and data compared
to traditional companies and business models, but that is a possibility to be explored in future
iterations of this research.
I would say a lot of it boils down to the organization and their approach to security. Do
they treat it as something important and serious? Again it's more than just a cost for the
business, it also competes in terms of productivity. The more hoops you have to jump
through for security, the less time your people have to do other things. If you force people
to say, enter a password, well now they're going to have to stop and enter their password
and that's going to lose a little bit of productivity. Now if you add two factor
authentication in there, they're going to have to get a text or enter an app and figure out
a code that they enter. Now you've lost even more productivity. A lot of times security and
things like productivity are at odds and you end up with people making business
decisions that seem like a good idea ‘yes we'll fix that maybe in three months,’ and
assuming nothing happens then that's okay but you're definitely taking a risk with
decisions like that. –DATA_11
Inability to Verify and Hold Third-Parties Accountable
Regardless of specialty, every interviewee cited third parties as a weak link in the
cybersecurity chain. Participants described certain norms and decorum for engaging with third
parties that discouraged practitioners from scrutinizing those companies’ security standards too
closely. While there were indirect ways to evaluate third party security, it was rare to be able to
actually test it, which forced practitioners to infer security strength based on ancillary data.
DATA_2 talked about the process of being forced to trust that vendors were actually doing what
they said:
We have security addendum that we attach to all of these engagements that these
companies have to sign-off on. So they see our security addendum and they agree to it.
Now do we know for a fact that they’re doing this? Well for long-term contracts, we’ll
have an internal audit but we don’t do that on day one. So are there any vendors out
there who are eager to get contracts whose security may not be quite as good as the
agreement says it is? [The answer is yes.] –DATA_2
Interviewees pointed out that there were a myriad of indirect ways security problems
could be discovered even if they were not disclosed by the company itself. The problem was that
many of these vendor or third-party risk management strategies relied heavily on self-report data
from the organization in question and assumed that the organization knew what it was doing in
the first place. Examples of risk management strategies included filling out periodic questionnaires,
checking compliance with common regulations (e.g., HIPAA, PCI-DSS, HITECH, etc.), industry
standards (e.g. NIST 800-53, ISO 27001, etc.), and pulse checks that asked companies to provide
information about what they were doing in response to short prompts. Similar to DATA_2,
DATA_4 talked about being forced to trust third party companies but also gave examples of how
she discerned whether companies knew what they were doing.
[The problem is] you have to trust then verify. So they may say, ‘Yes I do all these
wonderful things’ and then you start asking for evidence and then you can - if they have
an online presence or an application- you can run security tests like [penetration tests],
you can run vulnerability reports, and there are ways you can get indicators if what
they’re saying on paper matches what they’re actually doing. […] That’s where you start
to discern if they actually know what they’re doing. If there’s some sort of variance there,
you’re going to want to push on that. You’re going to want to ask them more questions
and figure out are they actually protecting it the way they need to. So that’s where a third
party program in any organization is really, really critical. –DATA_4
Security Testing. All participants noted that security testing was a vital part of risk
management and important for verifying third party security assurances as well as discovering
unknown problems or vulnerabilities in the network. As an offensive security specialist,
DATA_9 described some of the challenges he faced working as a penetration tester and being
forced to trust assurances provided by third party companies without actually testing them.
Quite commonly you have to implicitly trust that the third party is secure. You don’t have
permission to test them. They might give you a report that says they’re secure but that’s
kind of my favorite part. You’re relying on the company that did that security test to know
what they’re talking about. –DATA_9
DATA_10 described a similar experience:
You [actually] can’t look [under the hood] and you’re ultimately going ‘well it’s in this
vendor’s best interest to keep my information private, secure, and they probably will do
it.’ But the truth is you have no idea how many times your information is being stolen,
how you’re being identified, how it’s being sold. […] The reality is everyone thinks
they’re protecting everything well and the vast majority of the world is not protecting
things well. Even if you are me, and I consider myself one of the best in the world, most of
the time you can’t assure anything. It’s all smoke and mirrors and lies. –DATA_10
When asked to explain the logic behind why security researchers were not allowed to
directly test the security of third parties, most could not provide a detailed justification for this
phenomenon. They argued that, from a security standpoint, this reasoning did not make sense since it
severely inhibited their ability to test the security of the company (i.e. the exact job they were
hired to do) but that was just how things worked. It took a while to get a clear, definite answer
but eventually one participant, a 30+ year veteran privacy and cybersecurity attorney working at
a major international law firm, was able to shed light on why security researchers were generally
not permitted to test third party security. She explained it was largely a business decision and the
main goal was not to disrupt operations or cause problems like data breaches for the company or
its other clients. Third party security could be assessed indirectly and there were many
alternative methods for doing this.
What you don’t want is like a penetration test of a cloud provider and you end up pinging
somebody else’s data and causing an incident with somebody else’s data that’s not your
company’s data. Those things have happened where the penetration test is used to just
see ‘can we penetrate the exterior of this cloud provider’ for example. What’s supposed
to happen is you land only in that company’s data. But if you’re in a public cloud and
you have a section cordoned off on a public cloud, there have been situations where the
penetration test- which was supposed to be happening only on that company’s data- ends
up accessing by mistake and creating a data breach for another company. So we don’t
want the testing for one company to make a data incident for another company that
wasn’t doing the testing or create instability on the network.
The best thing to do is look at other things like getting a third party security
auditor or there are digital watermarks you can put on your data. If you see your data in
a location it shouldn’t be, according to the agreement, that gives you a good sense
without having to necessarily make the vendor’s system unstable or get into proprietary
algorithms that are patented and not public or breach IP. Basically, [you don’t have to]
violate somebody else’s rights to ensure your own are okay. So there are other things
companies can do like the watermarking of the data, encrypting the data, or other things
to help protect. –PRIVACY_11
While the decision might not have made sense from a security perspective, it did from a
business standpoint. This industry-wide norm of not allowing researchers to directly test third
party security came down to competing priorities. Dominant realities within the field prioritized
preserving business relations, avoiding liability, and not disrupting business over identifying
security threats deeply hidden in a network. The primary goal across the private sector was to
keep business moving and aggressive testing of third party security was deemed unnecessary in
most situations. Whether the security community liked it or not, this was the dominant norm in
the field and it would not change unless there was a conscious effort across the public and
private sector to reorient dominant priorities and realities around a new approach. When asked
how this aligned with his experience, DATA_11 agreed with the norms and logic described by
PRIVACY_11 but also acknowledged that this approach left major holes in network security.
I can see why organizations are reticent to allow just any third party to go poking around
however they want. They try to organize these things into actual organized penetration
tests probably so that they can distinguish the real attacks from the fake attacks or at
least be able to test their defenses. But yeah that's definitely one of those things where
there's etiquette involved as well. Like if you see someone scanning your network and
you're paying attention, you'd be like ‘hey, why are you doing this.’ You might anger
some folks too if they are paying attention by probing their security so it's always the best
practice to kind of ask ahead of time, ‘can we test this?’ But it does seem weird. It seems
like we should be able to [test third party security]. If it's secure it'll be secure in the face
of anybody but that's not really how things happen in practice unfortunately. –DATA_11
Security researchers also pointed out shortcomings related to indirect testing methods.
They explained indirect assessments were largely reactive in their approach. Data usually had to
be exposed in order for a threat to be discovered, whereas more proactive approaches could
allow researchers to potentially detect threats before data escaped. Additionally, they also argued
that many of these indirect methods tended to be confirmatory rather than exploratory. This
meant the methods were designed to confirm whether a specific threat existed but often were not
good at detecting new problems that had yet to be discovered. An example of this would be
assessing the security of a house and how easy it would be for an unauthorized person to enter.
When investigating how someone gained access into the house, confirmatory approaches would
check common points of entry such as doors and windows to look for any signs of forced entry
to ‘confirm’ whether the intruder came in that way or not. Exploratory approaches would follow
a similar process but also look for clues or signs of unusual activity around the property to
identify possible points of entry that were less obvious or highly specific to that house. While
exploratory approaches are not necessary for every situation, the growing complexity of
networks and interconnected infrastructures often meant that each company’s technical systems
were unique. This made common vulnerabilities harder to identify, leading to gaps in testing and
audits. DATA_9 recounted his experience conducting audits and how common it was for basic
problems to go unnoticed or get overlooked during the audit process.
I’ve done testing for major financial organizations and this is one you’ll hear from any
tester who’s actually good at what they do. The smart companies rotate the company they
get to do the security audit every quarter. For a lot of the big banks they’ll actually hire
multiple security companies to do the same audit because they understand that a lot of
the time it may be missed by the first 20 people that do the test. That is crazy if you really
think about that. If you’re a doctor and it’s the 21st doctor who figures out you’ve got a
knife in the side of your ribcage, we wouldn’t respect that profession. Whereas with
computer security we’re like ‘oh yea, you missed the ribcage and knife occasionally,
that’s just a part of the profession.’ That to me is definitely the scariest thing that I’ve
seen in this space. But it seems like a lot of the people that talk with authoritative tones
and say they take privacy seriously and do all the penetration tests –granted some of
them do, don’t get me wrong. I’m not for a second saying that every single business that
says that is lying through their teeth– I’m saying that a significant chunk of them are
negligent and don’t actually understand what that means. For every company that’s
doing it right, there’s a hundred companies that aren’t. –DATA_9
While this disconnect in testing strategy was not the fault of any one person or group, it
did highlight larger systemic problems within the industry that showed how vital accountability
measures like audits and security tests could be undermined by industry norms that
disproportionately favored compliance and business efficiency. Although the majority of
interviewees complained about the ‘trust then verify’ approach, they also admitted it was just a
natural part of the job. They hoped these alternative methods were enough to protect the
organization but in the final analysis there was very little that could be done without interfering
with existing norms and arrangements. These examples also called attention to the pressure put
on cybersecurity practitioners, both in-house and outside consultants, to get out of the way of
business as much as possible. According to DATA_11, despite the number of alternative
assurances that can be given, security ultimately comes down to the prerogative of each
individual organization. An organization’s need combined with the amount of time and effort
leadership is willing to invest will eventually determine how cybersecurity plays out.
There are ways to do those things effectively but again, how much work are you willing
to put in? If you are a mom and pop retail store and you're not really protecting a ton of
very valuable information, you're probably not going to take the time to do that. But if
you're a bank, then it makes a heck of a lot more sense. Each organization has to be
thinking about these things in terms of their organization. –DATA_11
A major theme throughout all study interviews was participants’ desire to have a larger
collective conversation about the underlying assumptions and systematic issues plaguing the
field. They argued progress was impossible if stakeholders were not willing to acknowledge,
understand, and address the deeply ingrained problematic incentive models and practices within
the field. This phenomenon was not limited to any one company, group, or sector but instead represented a
systemic issue across society that must be addressed collectively.
Conclusion
By understanding factors that impeded the development and practice of cybersecurity
within organizations, this study added nuance to ongoing conversations about how to meet the
demands of the current cybersecurity environment. Using semi-structured interviews, this pilot study
found that improving cybersecurity within organizations required overcoming strongly held
beliefs and practices that are misaligned with enhanced cybersecurity postures. Interviewees
were adamant that although organizations said they cared and were committed to improving
cybersecurity, the reality was that high levels of cybersecurity were not required –and often not even
desirable– when running a business. Every interviewee was careful to specify that this was not
the case for every company; however, the decision to prioritize business operations over
cybersecurity was not limited to a handful of organizations but was a pervasive and normalized
expectation in the field. Interviewees cited the industry’s reluctance to increase user friction as a
major barrier to improving security, which severely limited the adoption of stronger policies and
practices. Interviewees also pointed out how current patterns of practice discouraged additional
verification of third-party companies. Participants were unanimous in their opinion that real,
meaningful change would only happen when catastrophic security events occur that are too big
to ignore.
Securing the public and private sector cannot be only about locking down networks and
reminding individuals to practice good cyber-hygiene habits. There also needs to be additional
research that focuses on top-level leadership to ensure that management at all levels and types of
companies are willing and able to take the necessary steps to prioritize security and protect their
organizations. Finding money to update legacy systems is one challenge but there also need to be
efforts to understand how technical capabilities interact with organizational realities, incentive
models, business interests, and human behavior. As this pilot study showed, studying the
shortcomings of cybersecurity on the ground and its origins in organizational culture and human
decision-making was important for the future of cyber policy and securing technical systems. By
investigating the way corporate and organizational cultures discouraged more comprehensive
security practices from being deployed, this study argued that the future of information security
lay in understanding the larger collective practices and cultures that comprise, enable, reinforce,
and undermine current approaches to cybersecurity. Unless more is understood about how
cybersecurity gets enacted and reproduced in practice, efforts to enforce, regulate, and secure this
space will continue to fall short.
CHAPTER 5: DISCUSSION
117
Chapter 5: Discussion
An overarching theme throughout the three studies in this dissertation was the limitations
posed by institutions. As practitioners attempted to implement more protections around data,
they struggled to mobilize resources and respond to jolts and crises such as new laws, threats,
and regulations. Despite efforts to align competing priorities and organizational realities around
data, practitioners found it difficult to obtain buy-in and support from senior leadership. This
often led to internal programs and approaches that were reactive in nature, responding to
mandates already in place rather than getting out in front of developments and proactively
planning to leverage upcoming changes.
Differences in Sample
There are a number of possible reasons why findings from the current study differed from
previous research, namely those of Bamberger and Mulligan (2015) and Waldman (2020). As
previously mentioned, the current sample was focused on practitioners at SMEs rather than Big
Tech or Fortune 500 companies. This could explain some of the variation as high-status
companies like those included in previous samples operated in a much different environment,
able to lobby and influence policy in ways most SMEs cannot. Additionally, many of these
companies had entire departments devoted solely to interpreting laws, ensuring compliance, and
defending the company against lawsuits and judgments whereas SMEs were often lucky to have
internal staff working on privacy issues full-time.
An additional factor to consider would be temporal differences in when research took
place. Bamberger & Mulligan’s (2015) work was published three years before the GDPR and
five years before the CCPA took effect. The regulatory environment would have looked very
different with almost no global standards or comprehensive data privacy or protection mandates
in place. While the GDPR factored heavily into Waldman’s (2020) research, most of his work
had been completed prior to the CCPA coming into effect. As this dissertation has discussed, the
CCPA represented a major turning point in US regulatory policy that challenged established
systems of practice.
Resistance
A major distinction between studies 1 and 2 and previous privacy on the ground
scholarship was the concept of resistance. While Bamberger and Mulligan (2015) and Waldman
(2020) focused on chief privacy officers (CPOs) resisting institutional arrangements imposed on
the organization through external jolts and crises, this dissertation highlighted the different ways
privacy practitioners were resisted by the institutions they were a part of. Additionally, within
both existing narratives the discussion centered on embedding privacy norms within
organizations whereas the current study called attention to the challenges faced by practitioners
as embedded actors within a system. This subtle distinction had major implications for
interpreting concepts like agency, ambiguity, and uncertainty. For both Bamberger & Mulligan
(2015) and Waldman (2020), CPOs operated from positions of opportunity and authority. Their
social positions and standing within the organization imbued them with agency, backed by
supportive environments that afforded them resources to pursue organizational changes with
minimal influence from institutional pressures. On the other hand, this research uncovered a
different perspective where privacy practitioners operated largely from a position of constraint
and limitation. Rather than being supported by existing infrastructures, they often found
themselves pushing back against institutional logics that viewed advances in privacy and security
to be at odds with business interests and dominant organizational realities.
Performative Privacy and Symbols of Compliance
Although the differences between findings from this research and previous scholarship might seem stark, there
was a significant amount of confluence in terms of the similar phenomena seen on the ground.
For instance, like Waldman (2020) this project also found evidence that vital accountability
measures became ‘symbols of compliance’ standing in for actual compliance. However,
according to participants, the decision to ‘perform’ privacy was not made unilaterally and often
resulted from compromises that needed to take place in order to keep the organization moving.
Study findings suggested that the performance of privacy as a ‘symbol of compliance’ in SMEs
was more the result of CPOs’ and privacy practitioners’ lack of agency and inability to meaningfully
change established organizational processes rather than a conscious decision to undermine
privacy for the sake of the company’s bottom line. How risk got framed was a strategic decision
by practitioners to mobilize change in light of limited resources and tepid support from the
organization and company leadership.
Lack of Practitioner Agency
Variations in Actor Social Position
One of the reasons why study interviewees found it difficult to respond to jolts and crises
was the lack of institutional support due to their social position within decentralized
organizational structures. Unlike Bamberger & Mulligan (2015) and Waldman (2020), whose
interviewees were centrally located within their organizations with easy access to senior
management, participants in this study were often isolated within their organizational structures
and struggled to bridge the divide between departments and secure buy-in from senior leadership
and employees across the company. Getting departments to work together or even on the same
page was a major challenge since participants often lacked the formal authority or standing
within their organization to require collaboration or changes. In these situations, privacy took a
backseat to business and practitioners focused on making programs compliant while minimizing
disruption to the company.
Differences in Field-Level Conditions
Differences in field-level conditions could also help explain some of this variation in
practitioner agency. Research by Bamberger & Mulligan (2015) and Waldman (2020) described
the privacy landscape as ripe with opportunities for pursuing different institutional arrangements.
This contrasted with current study findings that suggested environments for SMEs were slightly
different in terms of resources and enabling conditions. According to Dorado’s (2005) typology
of action, the organizational environments described within this project could be characterized as
opportunity opaque, which equated to high levels of institutionalization (i.e. current norms,
values, and practices were very regimented and deeply ingrained in everyday practice) and low
heterogeneity (i.e. there were certain acceptable ways of doing things that did not allow for much
deviation). Opportunity opaque environments tended to discourage actor agency by making it
harder to pursue actions that diverged from the established practices or behaviors. In this case,
existing organizational logics and realities made it difficult for practitioners to develop
approaches that pushed beyond accepted industry norms due to concerns about legitimacy and
not wanting to attract unwanted attention or regulatory scrutiny. Big Tech and Fortune 500
companies like those interviewed by Bamberger & Mulligan (2015) and Waldman (2020) had
much higher status than SMEs and presumably could navigate field-level conditions differently
which allowed for more aggressive and innovative strategies to be pursued.
Dealing with Ambiguity and Uncertainty within Organizations
The unstable US regulatory environment constituted an acute field-level problem for
SMEs. Unlike Bamberger & Mulligan (2015) and Waldman (2020), whose CPOs saw
opportunity in legal ambiguity, study interviewees found the lack of clear standards and
guidelines to be a major challenge when implementing privacy protective programs. The lack of
certainty around legislation and regulatory policy made it difficult for practitioners to advocate for
different paths forward and secure buy-in from leadership and department management. One
point of contention between interviewees was how to best enact privacy. Without the resources
to go above and beyond compliance mandates, companies defaulted to the safest option, which was
usually staying with the pack and following a minimum standards approach to compliance.
Mobilizing Change
One of the most common ways study participants overcame their lack of agency was through
interpersonal communication. Using a bottom-up approach, interviewees described reaching out
and working with individual departments in an attempt to bring multiple stakeholders together. A
popular technique involved using risk framing to establish complementary goals and
understandings around data and privacy across the organization. The majority of interviewees
aligned with Bamberger & Mulligan’s (2015) conception of risk management as a key driver for
operationalizing privacy on the ground. Framing the conversation in these terms helped offset
some of the ambiguity and uncertainty related to the regulatory environment by focusing
attention on areas of increased liability. The vocabulary of risk made clear the threats dominant
practices and organizational realities posed to the company and its bottom line. Talking about
data retention in terms of ‘liability’ and potential for lawsuits was more effective for attracting
attention and generating institutional support than speaking about ‘information governance,’
‘privacy compliance,’ and ‘data ethics.’ Even though organizations were not quite on track to get
rid of unnecessary data, this common language offered a useful avenue to leverage resources by
establishing shared meaning across departments. According to participants, this was necessary to
begin the process of adopting new organizational realities and practices around data.
While interviewees acknowledged critiques by Waldman (2020) about the problems of
treating data retention as a means of minimizing risk to companies (rather than protecting
consumers from data-related harms), some pointed to their company’s individual efforts to
combat this while others provided insight into the larger context of their environment.
Interviewees pointed out that despite obligations to improve data governance, data was
continually being generated and collected at a faster rate than ever before. They stressed that it
was becoming increasingly unrealistic to expect companies to keep track of every piece of data
moving through an organization especially when the original intent of these systems was to
extract as much data from people as possible, not protect them. One way to account for the
proliferation of data, ambiguity in laws, and lack of standards around data privacy was to treat
privacy as a process. Even if practitioners could not protect data from being compromised, they
could protect the organization from liability if they could establish they had acted in a reasonable
manner. This often came down to providing documentation that justified their actions and proved
some level of planning and preparation went into this process.
Interviewees were often keenly aware of industry-wide criticisms that argued many of
these actions that supposedly demonstrated compliance (e.g. audits, data inventories, etc.) did not
actually enhance data protection for consumers. These findings aligned with research by
Waldman (2020) who also recognized these tensions within his sample. “Many of these
professionals, however, are torn between their privacy advocacy and the need to influence policy
with a seat at the table or the ear of the executive. Standing in the way of corporate profits by
saying ‘no’ too often or rejecting profitable products for privacy reasons can erode trust with
executives, damage their ability to get things done, or get them fired and replaced by someone
more pliant” (Waldman, 2020, p. 833). Study participants were extremely conscious of the
neoliberal pressures on them to advance organizational interests and protect the company at all
costs (even if it meant putting consumer interests second). When asked how they viewed the
privacy environment from the standpoint of a consumer, many expressed pessimism saying as a
whole the field was not doing enough to protect individuals. However, when speaking from a
practitioner standpoint they acknowledged the major challenges and limitations facing the field
and noted that while organizations were doing what they could, it was unlikely that the state of
consumer privacy would greatly improve anytime soon.
CHAPTER 6: CONCLUSION
124
Chapter 6: Conclusion
While decisions around policy and regulation continue to take shape, this research
advocated for more emphasis to be placed on the experiences of privacy and security
practitioners within organizations. As this dissertation showed, these communities have vital
insights and expertise about the resiliency of organizations, specifically how data gets protected
in practice, which can greatly inform the development and implementation of future privacy and
cybersecurity strategies. This work represented an opportunity to expand the boundaries of
privacy and cybersecurity by building on current research aimed at reducing vulnerabilities and
addressing shortcomings in law and policy in practice.
Drawing on research from privacy law and organizational studies, this dissertation
examined how data privacy and security practitioners understood and navigated data-related
uncertainties in their everyday interactions. While much of the existing research focused heavily
on individual-level decision-making processes, this dissertation adopted a more systems
approach to study how privacy and security got enacted and reproduced within organizations in
practice. Results explored the dynamic forces that affected data privacy and security in practice,
specifically how traditional ways of defining, measuring, and protecting data and privacy failed
to account for the values, assumptions, and practices ingrained and reinforced by the US sectoral
approach to privacy law. Although agency was touted as a fundamental value of US culture and
privacy law, the ability to exercise agency was limited even for practitioners tasked with
managing data and risk within organizations. Deeper analyses of practitioner accounts revealed
recursive practices and problematic incentive models that were characterized by the performance
of compliance and forced trust in technology and information systems. Given the push in
California and across the US for stronger data privacy and security laws, this line of research is
especially important for closing gaps that characterize the patchwork approach to US privacy
policy.
This research contributed to existing privacy and cybersecurity literatures by
demonstrating the value of studying practitioners as embedded actors in a system. By focusing
on the challenges encountered by practitioners as they implemented privacy and security within
organizations, this dissertation presented a slightly different perspective that added nuance to
current understandings in the field. Contextualizing results within existing bodies of literature
was especially important for advancing the field since this study’s sample included a wider range
of practitioners and experiences, not just those drawn from elite corporations. Findings diverged
from existing scholarship possibly due to its focus on practitioners at smaller, less resourced
companies. Interviewees’ experiences were characterized by constraint, as opposed to Bamberger
& Mulligan (2015) and Waldman (2020), whose narratives were characterized by opportunity.
Although ambiguity, risk, and performance of privacy were central themes within this study’s
data, interviewees provided a different lens to interpret what these themes meant and how they
operated in practice.
Study Limitations
This dissertation was severely impacted by the COVID-19 pandemic, which posed major
challenges to data collection. Data collection commenced in the beginning of 2020 and had to be
moved completely online with very little notice. While ideally this study would have included
additional methods such as participant observation, local public health and University restrictions
did not allow for any in-person fieldwork to take place, meaning every aspect of data collection,
from recruitment and networking to interviews, took place online. Without face-to-face
interactions, recruitment took longer as I was forced to rely on my networks and cold emails to
recruit potential participants.
Although this initially took additional time and effort on my part to research and recruit
participants, I was ultimately very successful, with approximately 70% of potential interviewees
contacted agreeing to be interviewed for the study (the other 30% either declined or did not
respond). The extra time taken to connect and explain the purpose of the research as well as why
I had reached out to that individual specifically resulted in an extremely knowledgeable,
experienced, and high quality sample that could speak about not only their work within their
organization but trends and the overarching evolution of their industry at large. One unexpected
upside to conducting virtual interviews was the candor participants brought to interviews. Being
at home allowed interviews to speak more freely and openly about sensitive topics compared to
public spaces or at their workplace. Another benefit of virtual data collection was the
perspectives I was able to gather recruiting more participants outside of California and even the
United States.
It should be noted that results from this study were not meant to be representative of the
entire field or of all practitioners. Findings represent a snapshot of the practitioner perspective
during a specific period as organizations adapted to the GDPR and CCPA mandates. Identifying
themes and similarities among practitioner experiences was important for furthering understanding
of how privacy and security were enacted and reproduced in practice. Although far from complete,
this dissertation offered a glimpse into the rich tapestry of experiences that make up the field of
privacy and cybersecurity in the United States.
Future Directions
Future studies should continue to explore how organizations are responding to data
privacy and security mandates in the United States. Every state that passes legislation related to
data privacy or protection has the ability to fundamentally change the regulatory landscape
across the United States. Future research should continue tracking these developments and look
to understand their impact on organizational privacy and security policies. Emphasis should also
be placed on documenting practitioner experiences throughout all different levels of the private
sector. As this dissertation has shown, the effects of legislation are not felt equally across sectors
or types of companies. Additionally, the US sectoral approach to data regulation opens the door
for comparative studies not only between the United States and the European Union, Canada,
China, Brazil, and other jurisdictions with comprehensive data regimes, but also across individual
states. Studies can examine the effects of comprehensive state data privacy laws (e.g., California,
Colorado, Virginia, and Utah) in practice to see if and how laws diffuse beyond their original
jurisdictions. Future research can also compare and contrast how these mandates interact in
practice and whether consumer privacy is better protected in states with comprehensive privacy
laws versus those without any formal laws on the books. Additional studies should also look to
incorporate different methods such as participant observation, focus groups, and surveys to allow
for triangulation of findings and explore the phenomena in greater depth.
REFERENCES
128
References
Abdelnour, S., Hasselbladh, H., & Kallinikos, J. (2017). Agency and institutions in Organization
Studies. Organization Studies, 38(12), 1775–1792.
Acosta, L. (2018). The right to respect for private life: Digital challenges, a comparative-law
perspective. European Parliamentary Research Service.
https://www.europarl.europa.eu/RegData/etudes/STUD/2018/628240/EPRS_STU(2018)6
28240_EN.pdf
Alpert, D. (2020). Beyond request-and-respond: Why data access will be insufficient to tame Big
Tech. Columbia Law Review, 120(5), 1215–1254.
Anant, V., Donchak, L., Kaplan, J., & Soller, H. (2020, April 27). The consumer-data
opportunity and the privacy imperative. McKinsey & Company.
https://www.mckinsey.com/business-functions/risk-and-resilience/our-insights/the-
consumer-data-opportunity-and-the-privacy-imperative
Andrew, J., & Baker, M. (2021). The General Data Protection Regulation in the age of
surveillance capitalism. Journal of Business Ethics, 168, 565–578.
Auerbach, C. F., & Silverstein, L. B. (2003). Qualitative data: An introduction to coding and
analysis. New York University Press.
Baik, J. S. (2020). Data privacy and political distrust: Corporate ‘pro liars,’ ‘gridlocked
Congress,’ and the Twitter issue public around the US privacy legislation. Information,
Communication & Society, 1–18.
Bajak, F. (2021, May 12). Tech audit of Colonial Pipeline found ‘glaring’ problems. The
Associated Press. https://apnews.com/article/va-state-wire-technology-business-
1f06c091c492c1630471d29a9cf6529d
Bamberger, K. A., & Mulligan, D. K. (2011). Privacy on the books and on the ground. Stanford
Law Review, 63(247), 247–315.
Bamberger, K. A., & Mulligan, D. K. (2015). Privacy on the ground: Driving corporate
behavior in the United States and Europe. MIT Press.
Banisar, D., & Davies, S. (1999). Global trends in privacy protection: An international survey of
privacy, data protection, and surveillance laws and developments. Journal of Computer &
Information Law, 18(1), 1–112.
Bassett, G., Hylender, C. D., Langlois, P., Pinto, A., & Widup, S. (2021). Data breach
investigations report. Verizon. https://www.verizon.com/business/resources/reports/dbir/
Battilana, J., & D'Aunno, T. (2009). Institutional work and the paradox of embedded agency. In
T. Lawrence, R. Suddaby, & B. Leca (Eds.), Institutional work: Actors and agency in
institutional studies of organizations (pp. 31–58). Cambridge University Press.
Battilana, J., Leca, B., & Boxenbaum, E. (2009). How actors change institutions: Towards a
theory of institutional entrepreneurship. The Academy of Management Annals, 3(1), 65–
107.
Beauvisage, T., & Mellet, K. (2020). Datassets: Assetizing and marketizing personal data. In K.
Birch, & F. Muniesa (Eds.), Assetization: Turning things into assets in technoscientific
capitalism (pp. 75–95). The MIT Press.
Bingham, A. J., & Witkowsky, P. (2022). Deductive and inductive approaches to qualitative data
analysis. In C. Vanover, P. Mihas, & J. Saldaña (Eds.), Analyzing and interpreting
qualitative data: After the interview (pp. 133–146). SAGE Publications.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in
Psychology, 3(2), 77–101.
Braun, V., Clarke, V., Hayfield, N., & Terry, G. (2019). Thematic analysis. In Handbook of
research methods in health social sciences (pp. 843–860). Springer.
Brill, J. (2011). The intersection of consumer protection and competition in the new world of
privacy. Competition Policy International, 7(1), 7–23.
Brough, A. R., Norton, D. A., & John, L. K. (2020). The bulletproof glass effect: When privacy
notices backfire. Harvard Business School.
Burnap, P. (2021). Risk management & governance knowledge area – Version 1.1.1. CyBOK.
https://www.cybok.org/media/downloads/Risk_Management_Governance_v1.1.1.pdf
Chander, A., Kaminski, M. E., & McGeveran, W. (2021). Catalyzing privacy law. Minnesota
Law Review, 105, 1733–1802.
Charette, R. N. (2020, August 28). Inside the hidden world of legacy systems. IEEE Spectrum.
https://spectrum.ieee.org/inside-hidden-world-legacy-it-systems
Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative
analysis. SAGE Publications.
Clark, M. A., Espinosa, J. A., & DeLone, W. H. (2020). Defending organizational assets: A
preliminary framework for cybersecurity success and knowledge alignment. Proceedings
of the 53rd Hawaii International Conference on System Sciences (pp. 4283–4292).
HICSS.
Collins, E., & Hinds, J. (2021). Exploring workers' subjective experiences of habit formation in
cybersecurity: A qualitative survey. Cyberpsychology, Behavior, and Social Networking,
24(9), 599–604.
Dalal, R. S., Howard, D. J., Bennett, R. J., Posey, C., Zaccaro, S. J., & Brummel, B. J. (2022).
Organizational science and cybersecurity: Abundant opportunities for research at the
interface. Journal of Business and Psychology, 37, 1–29.
de Montjoye, Y.-A., Hidalgo, C. A., Verleysen, M., & Blondel, V. D. (2013). Unique in the
Crowd: The privacy bounds of human mobility. Scientific Reports, 3(1376), 1–5.
DiMaggio, P. J., & Powell, W. W. (1983). The iron cage revisited: Institutional isomorphism and
collective rationality in organizational fields. American Sociological Review, 48(2), 147–
160.
DiMaggio, P. J., & Powell, W. W. (1991). Introduction. In The new institutionalism in
organizational analysis (pp. 1–38). University of Chicago Press.
Dorado, S. (2005). Institutional entrepreneurship, partaking, and convening. Organization
Studies, 26(3), 385–414.
Dreibelbis, R. C., Martin, J., Coovert, M. D., & Dorsey, D. W. (2018). The looming
cybersecurity crisis and what it means for the practice of industrial and organizational
psychology. Industrial and Organizational Psychology, 11(2), 346–365.
Dudley, R., & Golden, D. (2021, May 24). The Colonial Pipeline ransomware hackers had a
secret weapon: Self-promoting cybersecurity firms. ProPublica.
https://www.propublica.org/article/the-colonial-pipeline-ransomware-hackers-had-a-
secret-weapon-self-promoting-cybersecurity-firms
Eiden, K., Kaplan, J., Kazimierski, B., Lewis, C., & Telford, K. (2021, August 4).
Organizational cyber maturity: A survey of industries. McKinsey & Company.
https://www.mckinsey.com/business-functions/risk-and-resilience/our-
insights/organizational-cyber-maturity-a-survey-of-industries
Entrust. (2021, January 28). Data from entrust reveals contradictions in consumer sentiment
toward data privacy and security. Entrust. https://www.entrust.com/newsroom/press-
releases/2021/data-from-entrust-reveals-contradictions-in-consumer-sentiment-toward-
data-privacy-and-security
Exec. Order No. 14028. (2021, May 12). Executive Order on Improving the Nation’s
Cybersecurity. The White House. https://www.whitehouse.gov/briefing-
room/presidential-actions/2021/05/12/executive-order-on-improving-the-nations-
cybersecurity/
Farzanehfar, A., Houssiau, F., & de Montjoye, Y.-A. (2021). The risk of re-identification
remains high even in country-scale location datasets. Patterns, 2(3), 1–8.
Garud, R., Hardy, C., & Maguire, S. (2007). Institutional entrepreneurship as embedded agency:
An introduction to the special issue. Organization Studies, 28(7), 957–969.
Germain, T. (2021, May 19). New dark patterns tip line lets you report manipulative online
practices. Consumer Reports. https://www.consumerreports.org/digital-rights/dark-
patterns-tip-line-report-manipulative-practices-a1196931056/
Goodyear, M. P. (2020). A rising tide lifts all consumers: Penumbras of foreign data protection
laws in the United States. Richmond Journal of Law & Technology, 27(2), 1–50.
Gundersen, M. (2020, December 3). My phone was spying on me, so I tracked down the
surveillants. NRKbeta. https://nrkbeta.no/2020/12/03/my-phone-was-spying-on-me-so-i-
tracked-down-the-surveillants/
Hartzog, W., & Richards, N. (2020). Privacy's constitutional moment and the limits of data
protection. Boston College Law Review, 61(5), 1687–1761.
Health Resources and Services Administration. (2019, October). What is telehealth? How is
telehealth different from telemedicine? The Office of the National Coordinator for Health
Information Technology. https://www.healthit.gov/faq/what-telehealth-how-telehealth-
different-telemedicine
Hill, M., & Swinhoe, D. (2021, July 16). The 15 biggest data breaches of the 21st century. CSO
Online. https://www.csoonline.com/article/2130877/the-biggest-data-breaches-of-the-
21st-century.html
Holm, P. (1995). The dynamics of institutionalization: Transformation processes in Norwegian
fisheries. Administrative Science Quarterly, 40(3), 398–422.
Hoofnagle, C. (2010, May). New challenges to data protection study - Country report: United
States. European Commission Directorate-General Justice, Freedom and Security.
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1639161
Hoofnagle, C. J., van Der Sloot, B., & Borgesius, F. Z. (2019). The European Union General
Data Protection Regulation: What it is and what it means. Information &
Communications Technology Law, 28(1), 65–98.
Hoogstraaten, M. J., Frenken, K., & Boon, W. (2020). The study of institutional
entrepreneurship and its implications for transition studies. Environmental Innovation
and Societal Transitions, 36, 114–136.
Huang, K., & Pearlson, K. (2019). For what technology can’t fix: Building a model of
organizational cybersecurity culture. Proceedings of the 52nd Hawaii International
Conference on System Sciences (pp. 6398–6407). HICSS.
IBM Security Services. (2014). IBM cyber security intelligence index report. IBM.
https://i.crn.com/sites/default/files/ckfinderimages/userfiles/images/crn/custom/IBMSecu
rityServices2014.PDF
IBM Security. (2018, March). The significant gap between data privacy and consumer trust.
IBM Security. https://newsroom.ibm.com/IBM-security?item=30435
IBM Security. (2021). Cost of a data breach report. IBM.
Jeong, J. J., Mihelcic, J., Oliver, G., & Rudolph, C. (2019). Towards an improved understanding
of human factors in cybersecurity. 2019 IEEE 5th International Conference on
Collaboration and Internet Computing (CIC) (pp. 338–345). IEEE.
Jeyaraj, A., & Zadeh, A. (2020). Institutional isomorphism in organizational cybersecurity: A
text analytics approach. Journal of Organizational Computing and Electronic Commerce,
30(4), 361–380.
Jeyaraj, A., & Zadeh, A. H. (2021). Exploration and exploitation in organizational cybersecurity.
Journal of Computer Information Systems, 1–14.
Kalhoro, S., Rehman, M., Ponnusamy, V., & Shaikh, F. B. (2021). Extracting key factors of
cyber hygiene behaviour among software engineers: A systematic literature review. IEEE
Access, 9, 99339–99363.
Kaminski, M. E. (2019). Binary governance: Lessons from the GDPR's approach to algorithmic
accountability. Southern California Law Review, 92, 1529–1616.
King, J. (2020, October 30). Dark patterns and the CCPA. The Center for Internet and Society.
https://cyberlaw.stanford.edu/blog/2020/10/dark-patterns-and-ccpa
King, J., & Stephan, A. (2021). Regulating privacy dark patterns in practice–Drawing inspiration
from California Privacy Rights Act. Georgetown Technology and Law Review, 5(1), 251–
276.
King, N. (1994). The qualitative research interview. In C. Cassell, & G. Symon (Eds.),
Qualitative methods in organizational research: A practical guide. SAGE Publications.
Lancieri, F. (2022). Narrowing data protection's enforcement gap. Maine Law Review, 74(1), 1–
68.
Lapowsky, I. (2019, March 17). How Cambridge Analytica sparked the great privacy awakening.
WIRED. https://www.wired.com/story/cambridge-analytica-facebook-privacy-
awakening/
Lawrence, T. B., Suddaby, R., & Leca, B. (2009). Introduction: Theorizing and studying
institutional work. In T. B. Lawrence, R. Suddaby, & B. Leca (Eds.), Institutional work:
Actors and agency in institutional studies of organizations (pp. 1–27). Cambridge
University Press.
Leca, B., Battilana, J., & Boxenbaum, E. (2008). Agency and institutions: A review of
institutional entrepreneurship. Harvard Business School Working Paper, No. 08-096.
Harvard Business School.
Levin, A., & Nicholson, M. J. (2005). Privacy law in the United States, the EU and Canada: The
allure of the middle ground. University of Ottawa Law & Technology Journal, 2(2), 357–
395.
Linden, T., Khandelwal, R., Harkous, H., & Fawaz, K. (2020). The privacy policy landscape
after the GDPR. Proceedings on Privacy Enhancing Technologies, (1), 47–64.
Lively, T. K. (2022, June 2). US state privacy legislation tracker. IAPP.
https://iapp.org/resources/article/us-state-privacy-legislation-tracker/
Lohrmann, D. (2021, January 22). 2020 data breaches point to cybersecurity trends for 2021.
GovTech. https://www.govtech.com/blogs/lohrmann-on-cybersecurity/2020-data-
breaches-point-to-cybersecurity-trends-for-2021.html
Lounsbury, M., & Ventresca, M. (2003). The new structuralism in organizational theory.
Organization, 10(3), 457–480.
Mandalari, A. M., Dubois, D. J., Kolcun, R., Paracha, M. T., Haddadi, H., & Choffnes, D. (2021).
Blocking without breaking: Identification and mitigation of non-essential IoT traffic.
Proceedings on Privacy Enhancing Technologies (PoPETs), 1–21.
May, B. E., Chen, J., & Wen, K. W. (2004). The differences of regulatory models and internet
regulation in the European Union and the United States. Information & Communications
Technology Law, 13(4), 259–272.
Meyer, J. W., & Rowan, B. (1977). Institutionalized organizations: Formal structure as myth and
ceremony. American Journal of Sociology, 83(2), 340–363.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook
(2nd ed.). SAGE Publications.
Moshell, R. (2004). ...And then there was one: The outlook for a self-regulatory United States
amidst a global trend toward comprehensive data protection. Texas Tech Law Review,
37(2), 357–432.
Mulligan, S. P., Freeman, W. C., & Linebaugh, C. D. (2019). Data protection law: An overview.
Congressional Research Service. https://crsreports.congress.gov/product/pdf/R/R45631
Murphy, E., Dingwall, R., Greatbatch, D., Parker, S., & Watson, P. (1998). Qualitative research
methods in health technology assessment: A review of the literature. Health Technology
Assessment, 2(16), iii–ix.
Nahra, K. J. (2016). Is the sectoral approach to privacy dead in the U.S.? Bloomberg Bureau of
National Affairs. https://www.wiley.law/article-Is-the-Sectoral-Approach-to-Privacy-
Dead-in-the-US
National Security Memorandum. (2021, July 28). National Security memorandum on improving
cybersecurity for critical infrastructure control systems. The White House.
https://www.whitehouse.gov/briefing-room/statements-releases/2021/07/28/national-
security-memorandum-on-improving-cybersecurity-for-critical-infrastructure-control-
systems/
Nicola, F. G., & Pollicino, O. (2020). The balkanization of data privacy regulation. West Virginia
Law Review, 123(1), 61–115.
NIST. (2021). Improving the nation's cybersecurity: NIST’s Responsibilities under the Executive
Order. National Institute of Standards and Technology.
https://www.nist.gov/itl/executive-order-improving-nations-cybersecurity
Nobles, C. (2018). Botching human factors in cybersecurity in business organizations.
HOLISTICA–Journal of Business and Public Administration, 9(3), 71–88.
Ohlhausen, M. K., & Okuliar, A. P. (2015). Competition, consumer protection, and the right
[approach] to privacy. Antitrust Law Journal, 80(1), 121–156.
Owen, K. O., & Dietz, A. S. (2012). Understanding organizational reality: Concepts for the
change leader. SAGE Open, 2(4), 1–14.
Padgett, D. K. (2012). Qualitative and mixed methods in public health. SAGE Publications.
Parks, A. (2022). Unfair collection: Reclaiming control of publicly available personal
information from internet data scrapers. Michigan Law Review, 120, 1–43.
Peltz-Steele, R. J. (2012). The new American privacy. Georgetown Journal of International Law,
44, 365–410.
Pernot-Leplay, E. (2020). EU influence on data privacy laws: Is the U.S. approach converging
with the EU model? Colorado Technology Law Journal, 18(1), 101–124.
Pike, E. R. (2020). Defending data: Toward ethical protections and comprehensive data
governance. Emory Law Journal, 69(4), 687–743.
Poehlmann, N., Caramancion, K. M., Tatar, I., Li, Y., Barati, M., & Merz, T. (2021). The
organizational cybersecurity success factors: An exhaustive literature review. In K. Daimi,
H. R. Arabnia, L. Deligiannidis, M. S. Hwang, & F. G. Tinetti (Ed.), Advances in
Security, Networks, and Internet of Things (pp. 377–395). Springer Cham.
Rahman, T., Rohan, R., Pal, D., & Kanthamanon. (2021). Human factors in cybersecurity: A
scoping review. IAIT2021: The 12th International Conference on Advances in
Information Technology (pp. 1–11). IAIT.
Ranking Digital Rights. (2020). 2020 Ranking Digital Rights Corporate Accountability Index.
https://rankingdigitalrights.org/index2020/
Reagin, M., & Gentry, M. V. (2018). Enterprise cybersecurity: Building a successful defense
program. Frontiers of Health Services Management, 35(1), 13–22.
Reegård, K., Blackett, C., & Katta, V. (2019). The concept of cybersecurity culture. Proceedings
of the 29th European safety and reliability conference (pp. 4036–4043). European Safety
and Reliability Association.
Renaud, K., & Ophoff, J. (2021). A cyber situational awareness model to predict the
implementation of cyber security controls and precautions by SMEs. Organizational
Cybersecurity Journal, 1(1), 24–46.
Richards, N., Serwin, A., & Blake, T. (2022). Understanding American privacy. In G. González,
R. Van Brakel, & P. De Hert (Eds.), Research handbook on privacy and data protection
law (pp. 60–72). Edward Elgar Publishing, Inc.
Rippy, S. (2021, September 16). US State privacy legislation tracker. IAPP.
https://iapp.org/resources/article/us-state-privacy-legislation-tracker/
Roos, A. (2006). Core principles of data protection law. Comparative and International Law
Journal of Southern Africa, 39(1), 103–130.
Sasse, M. A., & Rashid, A. (2021). Human factors knowledge area – Version 1.0.1. CyBOK.
https://www.cybok.org/media/downloads/Human_Factors_v1.0.1.pdf
Schwartz, P. M. (2019). Global data privacy: The EU way. New York University Law Review, 94,
771–818.
Security Magazine. (2021, August 4). Data breaches in the first half of 2021 exposed 18.8 billion
records. Security Magazine. https://www.securitymagazine.com/articles/95793-data-
breaches-in-the-first-half-of-2021-exposed-188-billion-records
Seo, M. G., & Creed, W. (2002). Institutional contradictions, praxis, and institutional change: A
dialectical perspective. The Academy of Management Review, 27(2), 222–247.
Shaty, O. (2021). Lessons learned from analyzing 100 data breaches. Imperva.
Solove, D. J., & Hoofnagle, C. J. (2006). A model regime of privacy protection. University of
Illinois Law Review, 2, 357–404.
Strauss, A. L., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures
and techniques. SAGE Publications.
Swire, P., & Kennedy-Mayo, D. (2020). U.S. Private-Sector Privacy Law and Practice for
Information Privacy Professionals. IAPP.
Tejay, G., & Klein, G. (2021). Editorial introduction. Organizational Cybersecurity Journal,
1(1), 1–4.
The Westin Research Center. (2021, October 1). Global comprehensive privacy law mapping
chart. IAPP. https://iapp.org/resources/article/global-comprehensive-privacy-law-
mapping-chart/
Thornberg, R., & Charmaz, K. (2014). Grounded theory and theoretical coding. In U. Flick (Ed.),
The SAGE handbook of qualitative data analysis (pp. 153–169). SAGE Publications.
Turner, S., Galindo Quintero, J., Turner, S., Lis, J., & Tanczer, L. M. (2021). The exercisability
of the right to data portability in the emerging Internet of Things (IoT) environment. New
Media & Society, 23(10), 2861–2881.
van Zadelhoff, M. (2016). The biggest cybersecurity threats are inside your company. Harvard
Business Review. https://hbr.org/2016/09/the-biggest-cybersecurity-threats-are-inside-
your-company
Wachter, S. (2018). Normative challenges of identification in the Internet of Things: Privacy,
profiling, discrimination, and the GDPR. Computer Law & Security Review, 34, 436–449.
Waddell, K. (2021a, March 16). California's new privacy rights are tough to use, Consumer
Reports study finds. Consumer Reports.
https://www.consumerreports.org/privacy/californias-new-privacy-rights-are-tough-to-
use-a1497188573/
Waddell, K. (2021b, March 16). Why it's tough to get help opting out of data sharing. Consumer
Reports. https://www.consumerreports.org/privacy/why-its-tough-to-get-help-opting-out-
of-data-sharing-a7758781076/
Waldman, A. E. (2018). Designing without privacy. Houston Law Review, 55(3), 659–727.
Waldman, A. E. (2020). Privacy law’s false promise. Washington University Law Review, 97(3),
773–834.
Waller, S. W., Brady, J. G., Acosta, R. J., Fair, J., Morse, J., & Binger, E. (2011, May).
Consumer protection in the United States: An overview. European Journal of Consumer
Law, 1–36.
Wolff, J. (2016, March 10). Why is the U.S. determined to have the least-secure credit cards in
the world? The Atlantic. https://www.theatlantic.com/business/archive/2016/03/us-
determined-to-have-the-least-secure-credit-cards-in-the-world/473199/
Woodard, R. L. (2004). Is your medical information safe? A comparison of comprehensive and
sectoral privacy and security laws. Indiana International & Comparative Law Review,
15(1), 147–182.
Yeh, C.-L. (2018). Pursuing consumer empowerment in the age of big data: A comprehensive
regulatory framework for data brokers. Telecommunications Policy, 42(4), 282–292.
APPENDIX 1: INTERVIEWEE DEMOGRAPHICS
143
Appendix 1: Interviewee Demographics
Participant ID | Specialty Domain | Job Title | Experience | Employment Sector | Gender
PRIVACY_1 | In-house Privacy | Chief Privacy Officer & Director of Policy and Privacy | 10+ years | Higher Education | Male
PRIVACY_2 | In-house Privacy | Chief Privacy Officer | 5+ years | Higher Education | Female
PRIVACY_3 | Law & Regulation | Partner Attorney (Cyber Law, Data Security, and Privacy) | 10+ years | Law | Male
PRIVACY_5 | In-house Privacy | Campus Privacy Officer | 10+ years | Higher Education | Female
PRIVACY_6 | Outside Privacy Consultant | Partner (Data Privacy & Security Consulting) | 25+ years | Consulting | Female
PRIVACY_8 | Law & Regulation | Partner Attorney (Privacy & Data Security) | 10+ years | Law | Male
PRIVACY_9 | Advocacy | Privacy Counsel & Lead | 7 years | Think Tank | Female
PRIVACY_10 | Outside Consultant | Counsel (Privacy Law) | 25 years | Law | Male
PRIVACY_11 | Law & Regulation | Partner Attorney (Privacy & Cybersecurity) | 30+ years | Law | Female
DATA_2 | In-house Security | Chief Information Officer | 30+ years | Higher Education | Male
DATA_3 | In-house Security | Chief Information Security Officer | 20+ years | Higher Education | Male
DATA_4 | In-house Security | Director, Governance and Risk Management | 10+ years | Higher Education | Female
DATA_5 | Cybersecurity Reporting | Cybersecurity Reporter | 10+ years | Media/Journalism | Male
DATA_8 | Cybersecurity Reporting | Journalist (Digital Security) | 20+ years | Media/Journalism | Male
DATA_9 | Ethical Hacking | Offensive Security Specialist | 10+ years | Consulting | Male
DATA_10 | Outside Security Consultant | Computer Security Consultant | 30+ years | Consulting | Male
DATA_11 | Product/Manufacturing | Threat Research Engineer | 17 years | Technology | Male
DATA_12 | Product/Manufacturing | Technical Product Manager | 8 years | Technology | Female
APPENDIX 2: STUDY 3 INTERVIEWEE DEMOGRAPHICS 145
Appendix 2: Study 3 Interviewee Demographics
Participant ID | Cybersecurity Domain | Job Title | Experience | Employment Sector | Gender
DATA_2 | In-house Security | Chief Information Officer | 30+ years | Higher Education | Male
DATA_3 | In-house Security | Chief Information Security Officer | 20+ years | Higher Education | Male
DATA_4 | In-house Security | Director, Governance and Risk Management | 10+ years | Higher Education | Female
DATA_5 | Cybersecurity Reporting | Cybersecurity Reporter | 10+ years | Media/Journalism | Male
DATA_9 | Ethical Hacking | Offensive Security Specialist | 10 years | Consulting | Male
DATA_10 | Outside Security Consultant | Computer Security Consultant | 30+ years | Consulting | Male
DATA_11 | Product/Manufacturing | Threat Research Engineer | 17 years | Technology | Male
DATA_12 | Product/Manufacturing | Technical Product Manager | 8 years | Technology | Female
PRIVACY_11 | Law & Regulation | Privacy & Cybersecurity Attorney (Partner) | 30+ years | Law | Female
Asset Metadata
Creator: Kwong, Jillian K. (author)
Core Title: Translating data protection into practice: exploring gaps in data privacy and cybersecurity within organizations
School: Annenberg School for Communication
Degree: Doctor of Philosophy
Degree Program: Communication
Degree Conferral Date: 2022-08
Publication Date: 07/25/2024
Defense Date: 04/22/2022
Publisher: University of Southern California (original); University of Southern California. Libraries (digital)
Tags: communication, cybersecurity, data privacy, data protection, data security, OAI-PMH Harvest, organizations
Format: application/pdf (imt)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisors: McLaughlin, Margaret (committee chair); Maclay, Colin M. (committee member); Riley, Patricia (committee member); Saxon, Leslie A. (committee member)
Creator Email: jilliakk@gmail.com, jilliakk@usc.edu
Permanent Link (DOI): https://doi.org/10.25549/usctheses-oUC111375212
Unique Identifier: UC111375212
Legacy Identifier: etd-KwongJilli-10982
Document Type: Dissertation
Rights: Kwong, Jillian K.
Type: texts
Source: 20220728-usctheses-batch-962 (batch); University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the author, as the original true and official version of the work, but does not grant the reader permission to use the work if the desired use is covered by copyright. It is the author, as rights holder, who must provide use permission if such use is covered by copyright. The original signature page accompanying the original submission of the work to the USC Libraries is retained by the USC Libraries and a copy of it may be obtained by authorized requesters contacting the repository e-mail address given.
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA
Repository Email: cisadmin@lib.usc.edu