CRYPTOGRAPHIC IMAGINARIES AND NETWORKED PUBLICS:
A CULTURAL HISTORY OF ENCRYPTION TECHNOLOGIES, 1967-2017
by
Sarah Myers West
A Dissertation Presented to the
FACULTY OF THE USC GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
COMMUNICATION
August 2018
Copyright 2018 Sarah Myers West
Acknowledgements
If citation is feminist practice, so too is the acknowledgment of all the people without whom
I could not have written this dissertation. I am truly grateful for the advice and support I received
throughout the process, and the lessons I will carry forward into my future work. I can only hope
that I pay forward a fraction of the kindness and generosity I have received.
The members of my dissertation committee are not only world class scholars, but
incredibly generous and kind mentors. Manuel Castells in particular is an advisor who I was
fortunate to meet early on in my academic career, and with whom I’ve had the rare opportunity to
engage in years of wide-ranging intellectual debates that I look forward to continuing in the years
to come. Thank you for your constant support and encouragement over the past seven years, and
to the Wallis Annenberg Graduate Research Fellowship for affording me the ability first to develop
the ideas at the core of this project with Manuel and then to chase them around the globe.
Mike Ananny forever changed how I understand sociotechnical infrastructure in his course
on Networked Publics: his deep, rigorous and engaged mode of critique models the kind of
scholarship I aspire to. I was especially grateful for his regular reminders that whatever anxieties
I felt along the way were ‘stage-appropriate’. Christina Dunbar-Hester introduced me to Science
and Technology Studies mid-way through my doctoral studies, and in so doing showed me how to
critically engage with the social implications of technologies. Our conversations about the process
and craft of academic research and writing will continue to guide me for years to come. Even
before he agreed to serve on my committee, Chris Kelty’s work introduced me to new ways of
thinking about the cultural practices of software developers. He helped me find my way at several
critical junctures by prompting me to search for the deeper questions underlying my research
interests.
The faculty at the Annenberg School for Communication and Journalism are truly some of
the best in the world, and I am deeply thankful for the years I’ve spent there. Sarah Banet-Weiser
reinvigorated my interest in cultural theory, modeling what it means to be a feminist scholar with
style and grace. Chris Smith led me through the cultural studies canon early on in a semester that
I continue to look back on fondly. Larry Gross, Tom Goodnight, Peter Monge and Patti Riley gave
me a solid foundation in communication scholarship that has helped me to situate the
interdisciplinary approach I take in my own research with greater confidence. I’m likewise grateful
to the Annenberg administrative staff, who provide an invaluable source of support to doctoral
students: Anne-Marie Campian, Sarah Holterman, Jordan Gary, Pauline Martinez, Billie Shotlow,
and Christine Lloreda, thank you so much for all you do for us.
In my last year of the PhD program I joined a stellar group of budding scholars, whose
feedback bolstered me throughout the writing process and kept me accountable: Sam Chan,
Courtney Cox, Cat Duffy, James Lee, Rogelio Lopez, Perry Johnson, and Raffi Sarkissian, I will
miss meeting with you every other week. Without MC Forelle, our ringleader, I quite honestly
don’t think I would have gotten this done. I’m so grateful for our friendship and look forward to
years more collaborating together. Nathalie Maréchal, our years meeting in far-flung corners of
the world have been an absolute delight, and I hope they continue for years to come.
I was able to present early versions of these ideas at several points along the way. Thanks
to those who brainstormed, listened, read, and gave feedback on early drafts of this work,
especially Francois Bar, Nancy Baym, Finn Brunton, Gabriella Coleman, Kate Crawford, Patrick
Davison, Joan Donovan, Kevin Driscoll, Mary Gray, Tarleton Gillespie, Colin Maclay, Kate
Miltner, Patti Riley, Lana Swartz, Nathan Walter and Meredith Whittaker.
The Global Voices community has been an important part of my life since 2012 – if there’s
anything the last few years have shown me, it’s that we could use more of the empathy and joy of
this community. Rebecca Mackinnon and Ellery Biddle were early influences on how I think about
the implications of technology and human rights, and constantly inspire me to produce work that
will have an impact on the world. Matthew Stender, Dia Kayyali and especially Jillian York have
likewise been important influences on my thinking; their conviction and intelligence inspire and
remind me to be an accomplice, not just an ally.
Lastly, I would not be able to reach this point without the incredible support I receive from
my friends and family. Thanks to Kia Hays, Michael Lawson, Gabby Lim, Chris Martin, Steve
Robillard and Zander Roberts for providing healthy distractions when I needed them. To the
parents I was lucky enough to inherit, Chip and Frances, my brother- and sister-in-law Bion and
Elizabeth, Ceci, and all of the Wangs and Wests, I am so grateful for your support over the years
– if I’m successful, I will be so proud to add one more Dr. West to the fold!
My grandmother has lived a life I aspire to: a journalist, world traveler, skier and climber
who received her degree in computer science in her 60s and continues to publish her own magazine
into her late 80s. I love you and am so grateful for your influence in my life. My father instilled a
love of the pursuit of knowledge at an early age that I know led me to take this path. I hope he
would have gotten a kick out of where it’s taken me since. My mother made countless sacrifices
to ensure I had the best education and as many opportunities as possible. Thank you for everything
you’ve given me, and to Craig for your constant support. Allie, I honestly don’t think I’d ever
make it through without your reminders that I’ll do just fine. You and mom are truly the best family
in the world.
Finally, Han: I couldn’t have asked for a better intellectual sparring partner and companion
on these adventures that life has thrown our way. Your love and support means more than I have
words to describe: my gratitude is inscribed in every word of every page.
Table of Contents
Acknowledgements
List of Figures
Abstract
Introduction
Chapter 1: The Paper Labyrinth: Cryptography as a Cultural Artifact
Chapter 2: New Directions: The Invention of Public Key Encryption
Chapter 3: The Crypto Wars and Cryptographic Infrastructure
Chapter 4: Cypherpunks, Crypto-Anarchy, and the Interpretative Flexibility of Encryption
Chapter 5: Transforming the Cryptographic Community
Conclusion
References
List of Figures
Figure 2.1. Symmetric Key Crypto System
Figure 2.2. Public Key Crypto System
Figure 3.1. ‘Sink Clipper’ Advertisement
Figure 3.2. PGP: Source Code and Internals
Figure 4.1. WIRED ‘Crypto Rebels’ Cover
Abstract
This dissertation provides an account of cryptography as a counterbalance to the force of
networked connectivity. I argue that encryption technologies, which enable the transformation of
a message or data into code inscrutable to anyone save those with the key to unscramble it, can
grant individuals space to disconnect, or to regain autonomy around how and to whom we connect.
Encryption can be used to protect our privacy under the conditions of surveillance capitalism. But,
importantly, it can do so without requiring that we exile ourselves from society: encryption is itself
a form of communication technology, with mutual relationships and trust paths built into its very
architecture.
Using multi-site ethnography to trace encryption across both historical archives and
contemporary communities engaged with the production of privacy-enhancing technologies, I
consider shifts in the cultural meaning of encryption technologies in relation to networked
infrastructure. I analyze a complicated tangle of regulatory, economic, and social factors that
combined from the 1960s to the present day to shape the evolution of cryptographic
technologies from a monopoly by state intelligence agencies, used for spycraft and information
control in the cold war, to developing new and alternative meanings for the protection of data and
enabling of e-commerce.
Over time, encryption technologies became the focus of a new social imaginary, one that
sought to utilize encryption to reclaim sovereignty for individuals and freedom from institutional
control. This imaginary, crypto-anarchy, emerged in the 1990s out of the work of a community of
amateur cryptographers engaged in an experiment to realize the potential of encryption in a
networked age. Crypto-anarchy is founded on the idea that encryption technologies could
reconstruct the distribution of power and authority in society; by enabling individuals to transact
anonymously, new forms of freedom become possible. Crypto-anarchy prized a vision of an exiled
individual, removed from the physical world yet powerful on the Net, a vision well-suited to the
utopian dreams of the 1990s but which aged as connectivity infused our daily lives and our on-
and off-line selves no longer felt so distinct.
Though the pursuit of freedom from state and corporate oversight gained currency in the
wake of the Snowden revelations, other aspects of crypto-anarchy fell out of vogue. In particular,
its prizing of technical expertise as the basis of meritocracy is being challenged by the emergence
of a new imaginary grounded in inclusion and participation. Through ethnographic observation of
communities of technologists working on cryptographic projects in the present day, I trace an
imaginary still in formation: one that sees the achievement of autonomy as a collective enterprise
and seeks to use encryption to create spaces where individuals with marginalized identities or
views can gather together in association free from the gaze of the surveillance state.
Introduction
On June 18th, 2016, Facebook Vice President Andrew Bosworth circulated a memo on the
company’s internal communications system that read:
We connect people. Period. That’s why all the work we do in growth is justified. All the
questionable contact importing practices. All the subtle language that helps people stay
searchable by friends. All of the work we do to bring more communication in. The work
we will likely have to do in China some day. All of it. (Mac, Warzel & Kantrowitz, 2018).
The memo outlines an extreme interpretation of the company’s maxim: to make the world more
open and connected. Connectivity was Facebook’s driving force, its reason for being. It justified
all the company’s practices: the ‘nudges’ used to encourage its users to share more, the slow
changes in default settings to make more of their information public, the development of
sophisticated targeting techniques to sell better customized ads to advertisers.
Two years later, CEO Mark Zuckerberg repudiated this view in his testimony before
Congress about Cambridge Analytica’s misuse of data it collected from 87 million users:
It’s not enough to just connect people, we have to make sure those connections are positive.
It’s not enough to just give people a voice, we have to make sure people aren’t using it to
hurt people or spread misinformation. It’s not enough to give people control of their
information, we have to make sure developers they’ve given it to are protecting it too.
Across the board, we have a responsibility to not just build tools, but to make sure those
tools are used for good (Zuckerberg, 2018).
These two statements are an encapsulation of cultural dynamics communication scholars have
observed for decades: that we live in a “network society” (Castells, 1996) and a “culture of
connectivity” (van Dijck, 2013) in which social production is transforming both markets and
individual freedom (Benkler, 2006). While much scholarship about the internet has centered on its
function as a connective force, we have since begun to grapple with its long-term effects: how
networks are also infrastructures that can be used to control (Barzilai-Nahon, 2008; Benkler, 2016),
that enable internet companies to accumulate vast amounts of digital data with little transparency
(Zuboff, 2015; Pasquale, 2014; Angwin, 2014), and facilitate surveillance by state intelligence
agencies (Schneier, 2015; Deibert, 2013) that can be misused to manipulate elections (Kreiss &
McGregor, 2017). Long before Edward Snowden revealed the bulk surveillance capabilities of the
National Security Agency and its partners in the UK, Canada, Australia and New Zealand,
communication scholars had grappled with the implications of networked
technologies as surveillance technologies: what do they mean for a society where we are always
watched, and are always watching one another?
Though this question can be grounded in antecedents from an analog world, recent
scholarship is exploring its computational dimensions from a variety of perspectives: examining
whether surveillance is experienced unequally by different members in a society (Browne, 2015;
Eubanks, 2017), facilitates algorithmic discrimination (Noble, 2018; Angwin et al, 2016; Joh,
2016), and is likely to create downstream harms for democratic citizenship (Tufekci, 2015;
McKelvey, 2014), spurring the need to reformulate our understanding of the ‘public’ under
conditions of surveillance capitalism and networked citizenship (boyd, 2010; Ananny, 2015;
Ananny, 2016a). Other work has explored modes of resistance to data collection and analysis
(Brunton & Nissenbaum, 2011) such as the reappropriation of surveillance technologies to ‘watch
the watchers’ through practices of sousveillance (Mann & Ferenbok, 2013).
Driven by the maxim in Science and Technology Studies to explore how “things could be
otherwise”, this dissertation builds on this mode of inquiry by providing an account of encryption
technologies as a potential counterbalance to the cultural force of connectivity. Encryption
technologies, which enable the transformation of a message or data into code inscrutable to anyone
save those with the key to unscramble it, can grant individuals space to disconnect, or to regain
autonomy around how and to whom we connect.¹ As such, encryption technologies are also
communication technologies, though they are little studied from the perspective of communication
scholarship. Encryption can be used to protect our privacy under the conditions of mass
surveillance. But, importantly, it can do so without requiring that we exile ourselves from society:
drawing upon perspectives in communication studies and STS, I argue that encryption is itself a
connective technology, with mutual relationships and trust paths built into its very architecture.
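To make the basic operation concrete: the minimal sketch below, written in Python with the third-party cryptography package and offered purely as an illustration rather than as any system analyzed in this study, shows symmetric encryption in its simplest form. A message becomes inscrutable to anyone without the key and legible again to anyone who holds it:

    # A minimal sketch of symmetric encryption (illustrative only).
    # Assumes the third-party 'cryptography' package is installed.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # the shared secret key
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"meet me at the usual place")
    print(ciphertext)                  # inscrutable without the key
    print(cipher.decrypt(ciphertext))  # b'meet me at the usual place'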
In 1993, WIRED Executive Editor Kevin Kelly wrote that encryption was:
…the necessary counterforce to the net's runaway tendency to connect everything. It is a
prime technology of disconnection. The net says, ‘Just connect.’ The cipher says,
‘Disconnect.’… A cipher is the yin for the network's yang, a tiny hidden force that is able
to tame the explosive interconnections born of decentralized, distributed systems (Kelly,
1993).
Encryption never came to fulfill this vision; though it was built into the defaults of internet
infrastructure in some ways – often those that would enable commercial transactions – it stopped
short of protecting the privacy of end-users’ communications. To what can we attribute this failure?
¹ Encryption can also be used for control: for centuries, its primary use was by the rulers of nation-states in order to maintain control over information. Government agencies in much of the 20th century sought to
establish a monopoly over the study and use of cryptography by asserting it was a critical national
intelligence resource. Even today, many countries are considering regulations that would curb the use of
encryption by members of the public, reserving it for intelligence agencies.
My investigation led me through a complicated tangle of regulatory, economic, and social
factors that combined over the course of the 1970s-2000s to influence the development of
cryptographic technologies; I found no simple answers. I traced through the transformation of
cultural meanings associated with encryption as it evolved over this period in history from being
monopolized by state intelligence agencies, used for spycraft and information control in the cold
war, to developing new and alternative meanings for the protection of data and enabling of e-
commerce.
It also became the focus of a new cryptographic imaginary, one that sought to utilize
encryption to reclaim sovereignty for individuals and freedom from institutional control. This
imaginary, crypto-anarchy, emerged in the 1990s out of the work of a community of amateur
cryptographers engaged in an experiment to realize the potential of encryption in a networked age.
Crypto-anarchy is founded on the idea that encryption technologies could reconstruct the
distribution of power and authority in society; by enabling individuals to transact anonymously,
new forms of autonomy become possible. Crypto-anarchy prized a vision of an exiled individual,
removed from the physical world yet powerful on the Net, a vision well-suited to the utopian
dreams of the 1990s but which aged as connectivity infused our daily lives and our on- and off-
line selves no longer felt so distinct.
Though the pursuit of freedom from state and corporate oversight gained currency in the
wake of the Snowden revelations, other aspects of crypto-anarchy fell out of vogue. In particular,
its prizing of technical expertise as the basis of meritocracy is being challenged by the emergence
of a new imaginary grounded in inclusion and participation. Through ethnographic observation of
communities of technologists working on cryptographic projects in the present day, I trace an
imaginary still in formation: one that sees the achievement of autonomy as a collective enterprise
and seeks to use encryption to create new kinds of networked publics where individuals with
marginalized identities or views can gather together in association free from the gaze of the
surveillance state.
My account is, in large part, one of conflict, contestation and failure. Much of my research
surfaced hopes and dreams that never quite came into being. But I see these failures as a productive
starting point to consider how things might be otherwise: to, as Walter Benjamin once put it, “brush
history against the grain” and uncover alternative possibilities. Leo Marx (2010) warned us about
the danger of casting technology as the “driving force of progress” – that all too often,
a focus on innovation leads us to over-account for technologies as transformative forces in the
world. The value we have placed on networked connectivity as a new and radical organizing
principle for society may indeed have led many to overlook its negative consequences. By taking
failure, conflict and breakdown at face value, I aim for this to be a generative account, and maybe
even a hopeful one. As Steven J. Jackson (2014) put it:
The world is always breaking; it’s in its nature to break. That breaking is generative and
productive…. It is always being recuperated and reconstituted through repair. The question
then becomes what we make of these facts, and what we do next (223).
Motivations
The Snowden revelations were published the summer before I began my doctoral studies.
I was already involved in digital rights advocacy as an editor for Global Voices Advocacy (Advox),
a citizen media network focused on the protection of free expression and access to information
online. As articles by Glenn Greenwald and Barton Gellman revealed progressively more invasive
elements of the United States’ surveillance apparatus, I observed people around me wondering
how the utopian dream of the 1990s could turn into a nightmare before our very eyes. How could
the ability to network and connect with others around the world – the transformative force that
fueled our work at Advox, and helped so many of our peers to advocate for change in Tunisia and
Egypt – also be a source of oppression?
My colleagues at Advox showed me that the premise of this (admittedly naïve) question
was flawed from the outset: the Snowden revelations illustrated that the US government had
replicated at a global scale the kinds of systems already used by companies to monitor our activities
and use them to sell us consumer goods. And, as scholars of race and gender would be quick to
remind me, the ‘utopian’ dream of the internet of the 1990s was only ever utopian for some.
In the wake of the revelations, many called for wider use of encryption as a protection
against surveillance – at times treating cryptography as a technical solution to a broad social,
economic and political problem (Gürses, Kundnani & Van Hoboken, 2016; Schneier, 2015).
Responding to consumer anxiety, technology companies have deployed encryption more actively
in protecting communications on their platforms. Adoption of the HTTPS standard, a form of web
security, grew dramatically in the years following the Snowden revelations (Felt et al, 2017), and
one billion users began encrypting the content of their communications overnight when WhatsApp
implemented end-to-end encryption in its app (Metz, 2016). In turn, government agencies around
the world have identified encryption as a barrier to effective policing of terrorism threats, because
it prevents them from conducting wiretaps in law enforcement investigations. To address this
challenge, law enforcement officials in the US, UK, France, Netherlands, and Australia, among
others, have sought to mandate that backdoors be introduced into the deployment of encryption in
communications technologies. The ensuing debate has made what for years was a relatively
mundane subject suddenly interesting, reviving a digital rights movement that is rallying around
the proliferation of encryption tools and enlisting companies like Apple and Google who utilize
encryption in their products to support them in their cause.
Observers of this debate note that it often feels as though the two sides are talking past one
another, with the discussion centering as much on what encryption is capable of doing in the first place
as on how it should be regulated. Policymakers see the imperatives of privacy and security at odds
with one another, arguing that though encryption is useful for the protection of privacy, it provides a
safe haven for criminals and terrorists (Homeland Security Committee, 2016). By contrast,
cryptographers and advocates see privacy and security as aligned; that encryption provides
protection against cyberattacks even as it protects the privacy of individuals (Abelson et al, 2015).
Part of the divide underlying this debate implicates a need to bridge the different languages
spoken by technical advocates on the one hand and non-technical policymakers on the other.
Convincing lawmakers of the impossibility of mathematically reconciling the desire to allow one
actor access to plaintext without opening the door for others is a challenge without a shared
technical language on which these claims can be demonstrated. As a result, the encryption debate
is indicative of the failures of language to adequately support claims on either side, made all the
more complicated by the discursive slipperiness of the term encryption itself.
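The technical crux can be made tangible with a toy sketch. The hypothetical key-escrow scheme below is my own illustration, not a description of any fielded proposal: every message key is also wrapped under an escrow key so that an ‘authorized’ actor can recover plaintext, and whoever obtains that single escrow key can therefore read every message:

    # Toy key-escrow sketch (hypothetical; not any fielded proposal).
    # Assumes the third-party 'cryptography' package is installed.
    from cryptography.fernet import Fernet

    escrow_key = Fernet.generate_key()        # held by the 'authorized' actor
    escrow = Fernet(escrow_key)

    def send(message: bytes):
        message_key = Fernet.generate_key()   # fresh key for each message
        ciphertext = Fernet(message_key).encrypt(message)
        wrapped_key = escrow.encrypt(message_key)  # exceptional-access copy
        return ciphertext, wrapped_key

    ciphertext, wrapped_key = send(b"a private conversation")

    # Whoever obtains escrow_key can unwrap every message key ever issued:
    recovered = escrow.decrypt(wrapped_key)
    print(Fernet(recovered).decrypt(ciphertext))

The sketch captures the argument cryptographers pressed in this debate: the wrapped copy that grants access to one actor is a standing target for every other.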
Part of the challenge seems to be that the term encryption is often used interchangeably
with the systems it is built into. Encryption is a part of contemporary networked infrastructure,
inscribed in the structures and technologies of the internet and working invisibly to support the
things we do with it (Star & Ruhleder, 1996). Encryption technologies are behind every credit card
transaction, Bluetooth connection, and mobile phone call made by billions of users worldwide.
They are used during the authentication of connections, protecting the connection between our computer browsers and the servers of the websites we navigate to. They protect data at rest, ensuring
that private information stored on servers is not easily accessed or changed by third parties.
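That invisibility can be observed directly. The short sketch below, using only Python’s standard library and an arbitrary placeholder host, opens the same kind of TLS session a browser negotiates silently on every HTTPS page load and prints what was agreed:

    # Observing encryption-as-infrastructure with the Python standard library.
    # 'example.com' is an arbitrary placeholder host; requires network access.
    import socket
    import ssl

    context = ssl.create_default_context()  # verifies the server's certificate
    with socket.create_connection(("example.com", 443)) as sock:
        with context.wrap_socket(sock, server_hostname="example.com") as tls:
            print(tls.version())                 # e.g. 'TLSv1.3'
            print(tls.cipher())                  # negotiated cipher suite
            print(tls.getpeercert()["subject"])  # the identity we verified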
Each of these infrastructures is an application of encryption, constructed by technologists
and deployed in particular ways. And thus, there are values and ethical principles inscribed in the
depths of the systems that deploy encryption. It is my aim in this dissertation to surface some of
them, to hold them up to the light and ask questions about how and why they were implemented
in particular ways in our networked infrastructure, and not in others.
This drive led me, initially, to interrogate the history of tracking technologies in the 1990s
and early 2000s and the discourse surrounding them in their early days, before they hardened into
infrastructure (West, 2017). I found that cryptography was critical to the development of e-
commerce, used widely to protect credit card transmissions and other kinds of financial
transactions online. But though it was built in by default into systems designed for these kinds of
behaviors, cryptographic software designed for privacy protection was hard to acquire and even
more challenging to learn how to use. Integrating accounts of technological production,
sociotechnical communities, and policymaking processes, I show how encryption became
embedded in networked infrastructures in ways that privileged the nascent commercial internet
industry while inhibiting the development of individual privacy and community-level autonomy.
But, as I found, there was a counter-narrative to the culture of connectivity that emerged at
the same time. Cryptographers and advocates envisioned alternative choices that incorporated
encryption into networked infrastructures, placing privacy rather than connectivity at the center
of the internet. Despite this, over the course of the last 50 years, a variety of commercial, political,
and cultural forces have prevented the widespread implementation of encryption for the use of
privacy protection.
This dissertation considers how the cultural values associated with encryption evolved
alongside the development of the Internet. As I will argue throughout, encryption is not a singular
object of study, not just process, infrastructure or technical artifact: encryption is tied into multiple
narratives around information and power, made all the more relevant by the emergence of the
network society (Castells, 1996). I am driven in my analysis by the question, if surveillance
capitalism has concentrated the ability to collect, analyze, and use the information we produce on
a daily basis in the hands of the state and corporations, how can we as individuals and communities
regain some semblance of autonomy?
Literature Review
Cryptography is a largely understudied area within the domains of communication and
Science and Technology Studies: though there is ample popular literature on the subject, including
several well-researched histories, academic scholarship of cryptography tends to be concentrated
in technical fields of study. This appears to be changing, however. Over the last two years a number
of new studies have begun to contribute to our critical understanding of encryption: Isadora
Hellegren’s (2017) work analyzing crypto-discourses and the work of the cypherpunks from a
political economy perspective; Karina Rieder’s (2017) analysis of Congressional hearings about
cryptography policy; Quinn Dupont and Alana Cattapan’s (2017) study of gender in the crypto
community through an analysis of the fictional characters Alice and Bob, used to explain
information exchanges in public key cryptography; Finn Brunton’s (2018) forthcoming cultural
history of digital cash, which includes an extensive treatment of the development of these ideas
within crypto communities in the 1980s and 1990s. The growth of this body of literature promises
to shed more light on the cultural valence of encryption technologies.
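For readers unfamiliar with the convention, Alice and Bob feature in exchanges like the following toy Diffie-Hellman key agreement, sketched here in Python with deliberately tiny, insecure numbers purely for illustration:

    # Toy Diffie-Hellman exchange between Alice and Bob. The numbers are
    # deliberately tiny and insecure; real systems use enormous primes.
    p, g = 23, 5                  # public: prime modulus and generator

    alice_secret = 6              # known only to Alice
    bob_secret = 15               # known only to Bob

    A = pow(g, alice_secret, p)   # Alice announces A = g^a mod p openly
    B = pow(g, bob_secret, p)     # Bob announces B = g^b mod p openly

    # Each side derives the same secret without ever transmitting it:
    assert pow(B, alice_secret, p) == pow(A, bob_secret, p)
    print(pow(B, alice_secret, p))   # shared secret: 2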
Absent this work in the early days of my research, I grounded my work in foundational
literature in communication, Science and Technology Studies, critical technology studies, and
infrastructure studies. I outline three broad analytical frameworks that were particularly influential
in shaping my conceptualization of the project below:
Critical Technology Studies
Like other scholars in Science and Technology Studies, I focus on the role of
technology as a mediating force for society while questioning the ways in which it is privileged as
an analytical category. Though I trace evolutions in cryptographic technology as the analytical
object at the core of this study, I aim to center the processes of meaning-making among actors and
communities, rather than the function of the technology itself. My object of study is how
cryptography is understood among technologists, policymakers and users – what are the values,
associations, and imaginaries developed around cryptography, and how have they shaped the
artifact?
I was challenged at many points by a persistent strain of technological determinism in the
communities I studied, and found Leo Marx’s work to be especially helpful in making sense of
this in my analysis. As Marx notes, technology is a “hazardous concept”, and is popularly believed
to be “capable of becoming a virtually autonomous, all-encompassing agent of change” (Marx,
2010, 564). He warns scholars of technology against the assumption that technology – or the term
he prefers, the mechanic arts – is a driving force of human history. This narrative is present
throughout much of what is taught in schools about the history of mankind; we link the
development of technologies like fire and the wheel to the development of intelligence among
homo sapiens, innovations in agriculture to the advent of civilization, and inventions like steam
power and the engine to the industrial revolution. It lives on today in some corners of the crypto
community in the emphasis placed on encryption as a solution to mass surveillance – what Gürses,
Kundnani and Van Hoboken (2016) call “crypto as a defense mechanism” – at times at the cost of
a political reading that would attend to the racial, gendered, and classed dimensions of surveillance
programs.
Marx’s work served as a persistent reminder to maintain a position of critical distance while
analyzing such perspectives – though I also aim throughout this dissertation to acknowledge the
prominence of such teleological or determinist perspectives in the worldview of the technologists
who populate this history. In doing so, I draw on Sally Wyatt’s (2008) call to take technological
determinism seriously while being careful, wherever possible, to distinguish between the ‘real’
practices of users in the world and the images of users held by system builders.
The Politics of Technologies
As the title of my dissertation suggests, Langdon Winner’s (1980) argument that “there is
no idea more provocative than the notion that technical things have political qualities” deeply
shaped my analysis (121). Drawing on Winner, my analysis aims to surface the particular ways
that encryption embodies specific configurations of power and authority. Though Winner focuses
on singular examples of the political properties of technologies, I see the politics of encryption as
mutable and constantly evolving in relationship to the systems in which they are embedded. Indeed,
much of my analysis centers on parsing contestations over the regulatory, economic and social
factors influencing the usage and adoption of encryption alongside evolutions in the hard
materiality of cryptographic systems.
Other STS scholars have been likewise concerned with the relationship between power and
technology. In their book Sorting Things Out, Geoff Bowker and Susan Leigh Star argue that “all
information systems are necessarily suffused with ethical and political values, modulated by local
administrative procedures” (Bowker & Star, 1999, 321). Part of the normative project of STS is to
focus on who is left out, harmed, forgotten and unserved by existing arrangements, and how
research might rectify the situation (Star, 1999).
Information technologies are somewhat different from the tomato sorters and bridges that
were the focus of Winner’s analysis. While other technologies may embed power relations in
material form by making particular forms of action easier or more difficult, information
technologies are a means for creating, circulating, and appropriating meaning – they can shape
which forms of meaning can circulate through society (Boczkowski & Lievrouw, 2007). As a
result, information technologies play a role in establishing hegemony, or the “winning and shaping
[of] consent so that the power of the dominant classes appears both legitimate and natural” (Hall,
1977). They have the capacity to control information flows and to support – or resist – particular
visions for society.
Communication and STS scholars have paid particular attention to the potential for power
and agency in the Internet itself as a form of infrastructure. Some see the Internet as a
sociotechnical infrastructure that maximizes individual agency and the flow of information, a view
tied both to aspects of the Internet’s technical design as well as the nature of its governance. At a
technical level, the Internet’s end-to-end structure and protocols that favor the horizontal
distribution of data over prioritization of content suggest a privileging of enhanced individual
agency on the network. This is reinforced by the nature of governance of the Internet: it remains,
for the most part, a space under the oversight of a collection of national and international
stakeholders of a variety of types, rather than a single authoritative body (with the exception of the
Internet’s addressing systems, which are governed by the multistakeholder organization ICANN).
At the same time, other scholars have contested whether both the design and practical
governance of the Internet reflect this level of agency in practice: for example, Flanagin, Flanagin
& Flanagin (2010) claim that though individual agency is built into the Internet’s design, the
technical capacity for the Internet to resist control is relatively fragile, and the current structures
that facilitate individual agency and freedom are open to political and commercial exploitation.
Moreover, the protocols that shape the Internet’s material structure are in and of
themselves a form of control, even though the form they take is one that favors enhanced individual
autonomy (Galloway, 2004). As a result, the politics of the Internet’s infrastructure must be
considered not only at the material level, but also in terms of the extent to which the material elements of the
Internet are shaped by political influence. Networks are not merely metaphors for new
organizational distributions, they are material ideologies, sites of variable practices, actions and
movements (Thacker, in Galloway, 2004), and are thus open to capture. Ultimately, power and
agency on the Internet are shaped by the interactions between the social and technical – they are
not preordained, fixed, or determined.
Understanding these dynamics is useful for establishing how cryptography can shed new
light on how we create new ways of being in relation to networked technologies – how the network
society offers possibilities that can be viewed as alternately liberatory or a form of control. The
politics of cryptography will inevitably be deeply tied to the social systems it becomes embedded
in.
Infrastructure Studies
Lastly, infrastructure – what Paul Edwards describes as “those systems without which
contemporary societies cannot function” (Edwards, 2003) – plays an important analytical role in
this study. Drawing on the work of Edwards and Susan Leigh Star, I understand infrastructure as
both having hard technical materiality and being shaped through social processes. Infrastructure
can be as tricky a concept as technology, though it largely lacks the same normative dimensions
and associations with progress. Star and Ruhleder (1996) see it as a tool in practice, constantly in
the process of becoming and mutually shaped by its social and technical dimensions. Leonardi
(2012) similarly argues that infrastructure is sociotechnical and implies imbrication, or the
entanglement of social and material agencies. By contrast, Edwards (2003) draws more of a
distinction between the social and technical when he says “societies build infrastructures…but
because of their endurance in time, infrastructures then become the more important force in
structuring society” (195).
It’s perhaps worth exploring both the technical and social aspects of infrastructure in more
concrete detail. At various points throughout this dissertation, I attend to the hard materiality of
cryptographic infrastructure and how it evolved in tandem with the shifting social and political
relations around encryption. Many scholars of STS have worked to understand the sociotechnical
nature of artifacts by “unfolding the political, ethical, and social choices that have been made
throughout [their] development” (Bowker, Baker, Millerand & Ribes, 2010, 99). I attempt to do
so here by tracing the political, economic, and social forces that shaped and reshaped popular
understandings of cryptography as it developed alongside networked infrastructure. I draw on STS
scholarship that considers, for example, how interpretive flexibility may evolve over the life cycle
of a technological system (Hughes, 1987, Winner, 1980), whether technological systems are
primarily social, technical, or a combination of both (Bowker, Baker, Millerand & Ribes, 2010),
how control is enacted through the construction of technical systems (Beniger, 1989), and the role
of non-human actants in shaping technological systems (Latour, 1991, Latour, 2005).
Though collectively this literature helps to explain the sociotechnical nature of
technologies, it may be worth considering what it uniquely means for infrastructure to be
sociotechnical. Infrastructure is important because of its role in holding society together and
ensuring its smooth functioning; when it works well, infrastructure has a tendency to fade into the
background, becoming visible only upon breakdown (Star & Ruhleder, 1996). But it is perhaps
because of its sociotechnical nature that infrastructure is able to become invisible in the first place;
its taken-for-grantedness is at least in part a function of being embedded in a community of practice
(Star & Ruhleder, 1996). As anyone who has spent time in an unfamiliar country may recognize,
getting to know how things work – navigating infrastructure as simple as buying a train ticket or
making a phone call – is part of the process of integrating into a new community.
Because infrastructure is always embedded in social arrangements, it can inscribe a set of
values and ethical principles into a system (Star, 1999). The design decisions inscribed in
infrastructure reflect the societies they are embedded in as much as they do the intent of their
creators. Thus, infrastructural design may signal what is important or of value, whose voice is seen
as authoritative or marginal, or what is seen as non-controversial or mainstream (Ananny, 2016b).
From the standpoint of the technical, the material dimensions of infrastructure may inscribe
particular visions of what is important, relevant, or necessary for society. Infrastructure can also
play an important role in connecting or disconnecting different parts of a society. This was most
famously illustrated by Winner in his depiction of the city planner Robert Moses’ overpasses,
which were just high enough to let cars cross from Manhattan to Long Island, but low enough to
prohibit buses and trucks from entering them. The result of this, Winner suggests, was the creation
of a boundary that effectively prevented poorer people – those who relied heavily on public
transportation – from crossing in to Long Island (Winner, 1980). As Winner shows, technologies
may be shaped consciously or unconsciously by abstract social constructs – such as the desire to
separate out people of different social classes – but we can nevertheless parse these power relations
by examining the construction of an artifact. I see this dynamic as particularly salient to
cryptographic technologies that foster the ability to disconnect and to become invisible.
But power relations are not necessarily fixed once inscribed into a technical artifact: they
can be reshaped by social forces. This social dimension of technology is captured in the work of
Ronald Kline and Trevor Pinch, who demonstrate how the car, a mode of transportation first
designed for use in urban settings, was put to a wide range of creative uses when brought into rural
America: the new automobiles were alternatively used as a source of power, a tool for domestic
work, or a piece of farm machinery in addition to serving as a mode of transportation (Kline &
Pinch, 1996). The social needs of rural users led them to adapt a machine designed for one purpose
to fit a range of others – or as Kline and Pinch describe it, artifacts can have interpretive flexibility.
The notion of interpretive flexibility has an important shaping influence on the questions
that drive this project, though it is ultimately only a partial fit for the dynamics I describe. Kline
and Pinch argue that though there is often a period during which relevant social groups may
develop alternative visions for how technologies are employed, eventually they reach a stage of
closure in which a technology’s meaning becomes more or less stable. In the case of encryption,
the meaning still remains somewhat flexible – as I describe in chapter one, different actors, such
as members of the technology industry and law enforcement, have different perspectives on what
encryption is, what it does, and what it should do that can be traced across longer historical
trajectories.
This may be in part a question of scale; encryption is a broad category with many different
implementations, which may each have their own meaning. However, there seems to be something
unique about the power relations in cryptography that resists stabilization – perhaps it is because
it is itself so well-suited to power politicking – but my primary argument is that encryption can
embody many different understandings of the relationships between power and information, each
attuned to the unique configurations of the social system in which it is embedded.
Methodological Framework
Much of this dissertation is the product of historical, interpretive research: hours spent in
archives making sense of the physical and digital documentation left behind by technologists and
regulators. But the questions I seek to answer are very much informed by present-day concerns: I
hope to evoke new possibilities for the internet of today by considering the imaginaries of the past.
My approach to the archival work that made up the majority of the research for this project was
guided by ethnographic inquiry into the current practices of communities working on
cryptographic projects – I found it necessary to trace encryption across past and present in order to
meaningfully answer my questions about its historical transformation. I thus conceptualize this
work as both cultural history and multi-site ethnography: I trace the evolving conceptions and
cultural valences of cryptography across a number of networked field sites, some of which exist
in the present and some of which are captured only in archives.
I outline these efforts in more detail below, but first describe the frameworks that guided
my methodological choices. Early on in the process I encountered the notion of “critical technical
practice” in Phil Agre’s Computation and Human Experience. In it, he outlines the value of
developing a critical consciousness that can expand on technical practices by attending to both the
structural and cultural levels of explanation; drawing on both critical analysis and technical
practice in order to engage in rigorous reflection on technical ideas and practices. Though they
would not adopt these words, many of the actors in this study – those I interviewed and the authors
of the documents I encountered in archives – were actively engaged in critical technical practice.
I read documents by technologists reflecting on the implications of particular directions for and
implementations of cryptography. I conducted interviews with technologists that asked them to
reflect on their own histories and the ideas that drove them in their work. I also observed
technological production in practice by conducting field work at several conferences. Across these
sites, I made observations at three levels of scale: first piecing together a narrative of how events
unfolded, then attending to the worldview of the actors involved, and finally engaging in my own
critical reflection.
This account will, inevitably, be fragmentary and partial. There are entire domains for the
use of cryptography that are given little attention here for various reasons: for example, I provide
little insight into evolutions in the use of encryption by state intelligence agencies after the 1970s, primarily because materials past this point tended to remain under classification.² Moreover,
‘following the actors’ led me away from other important areas of development for cryptography
during this period, including movements outside of the United States and the development of
encryption technologies for incorporation into digital rights management technologies. Instead,
taking the invention of public key cryptography as my starting point, I focused more narrowly on
the uses and practices of encryption technology that are relevant to the privacy and security of
internet users. Lastly, in my pursuit of answering the questions that drove this project, I noticed
gaps in whose voices were represented in the archives: those who spoke were primarily men with
high levels of technical expertise and education, even though women and people of color were
actively involved in cryptologic enterprises during World War II.³ I address the implications of
these absences in the conclusion, and aim to explore them further in future work.

² In fact, I suspect there are many materials before the 1970s that are likewise classified – I found
the archives of corporations, which were ‘declassified’ much earlier, to provide valuable insight
into the activities of government agencies by documenting who they met with and early drafts of
project proposals.

³ See, for example: Mundy, L. (2017). Code Girls: The Untold Story of the American Women Code
Breakers of World War II. New York: Hachette Books; Fagone, J. (2017). The Woman Who Smashed
Codes: A True Story of Love, Spies, and the Unlikely Heroine Who Outwitted America’s Enemies. New
York: Dey Street Books; Williams, J. (2001). The Invisible Cryptologists: African Americans, WWII to
1956. Center for Cryptologic History, National Security Agency. Retrieved Mar. 31, 2018 from
https://www.nsa.gov/about/cryptologic-heritage/historical-figures-publications/african-americans/.
Archival Research
I drew on several archives, both physical and digital, throughout my research. The Martin
Hellman papers at Stanford Library were an invaluable source of information on the invention of
public key encryption and contemporary issues around the National Bureau of Standards’ adoption
of the Data Encryption Standard. The Paul Armer papers at the Smithsonian Museum of Natural
History provided useful context on the period, particularly for understanding how members of the
computer industry conceived of and expressed concerns about data security and privacy from an
early stage. I was able to obtain materials from IBM Research, including ‘declassified’ records
that provided insight into the company’s early work in developing commercial uses for
cryptography. More records on IBM’s activities, as well as other companies like RSA and
Netscape, were obtained from the Computer History Museum. I also found the oral history
collections produced by the Charles Babbage Institute at the University of Minnesota to be an
invaluable resource, providing biographical narratives that could help me understand how and why
certain events unfolded.
At each archive, I conducted general searches, often with the assistance of librarians, for
all materials relating to cryptography and encryption. At each subsequent archival visit, I added to
these searches the names of companies that were active in this space (such as RSA, Public Key
Partners, and Netscape), as well as prominent individuals who were engaged in the study of
cryptography (such as Hellman, Whitfield Diffie, Ron Rivest, Adi Shamir, Leonard Adleman, and
David Chaum), generating further sources of material to study.
In some cases, I was unsuccessful in obtaining materials. For example, I contacted RSA
Data Security to ascertain whether they had any corporate archives, but was unable to obtain a
response. I also received only a few materials from IBM due to resource limitations, though I was
able to obtain additional materials elsewhere. I suspect there may be additional information that is
still under classification by the company. Finally, I sought wherever possible to triangulate
materials in the archive with popular press materials obtained through the library, both to check
the accuracy of assertions made in the archival documents as well as to provide additional context.
The fourth chapter is informed largely by research from the Cypherpunk Archive, a
repository of emails sent between members of a listserv devoted to hobbyist cryptographers.
Drawing on this archive led to some unique methodological challenges, including reconciling
information drawn from the archive with authority records, navigating the archive without
knowing what exactly to look for (Hangal et al., 2015), as well as making ethical decisions about
the protection of participants’ privacy and confidentiality. Given this, I will elaborate in a bit more
detail on how I addressed these challenges in the context of this study:
1. Volume and archive navigation – Making sense of the Cypherpunk Archive
required understanding its surrounding context and deciding how to navigate the data
accordingly. The listserv was created on September 21, 1992, shortly after the first in-person meeting of the Cypherpunks at Eric Hughes’ house in Berkeley, California, and
continues to the present day. However, because of the philosophical views of its
administrators, who felt that spam filters constituted a form of censorship, the archive soon
became unwieldy in size: some versions contain a large amount of spam, and even filtered
versions contain more data than I could read (administrators estimated around 150 e-mails
were sent per day at a high point). Moreover, the archive is fractured: in 1997, one of the
administrators shut down the list, and it migrated to another server. A few years afterward,
at least one reporter declared the cypherpunks dead (Rodger, 2001), though future projects
proved otherwise.
In making methodological choices, I decided to read through the listserv selectively.
I read all e-mails in the first year of the archive’s formation, to get a clear sense of the
nature of the group’s debates, the range of topics discussed, and how the group established
a community culture over the course of the first year. Subsequently, I used news articles
and policy documents to flag debates of interest to provide background knowledge and
selectively read the archive during these time periods. In subsequent work I plan to
triangulate these archival findings through interviews with members of the listserv and to
identify additional discussions and debates that may have been of interest internally within the community but did not rise to mainstream discussion (a sketch of this selective-reading workflow appears after this list).
2. Textual analysis - As another mechanism for managing volume, I used MAXQDA,
a qualitative analysis software tool, to thematically code the text.
3. Confidentiality – The privacy of listserv members is a particularly tricky issue in
this context. Ethical use of listserv material requires considering to what extent the
participants at the time knew that their information was being made public and would
consent to its use for research (Sixsmith & Murray, 2001). In my decision to publish
selections from the archive and attribute them to their authors, I considered the following:
a. The archive is publicly available, and has been for some time. Similar studies
have differentiated between private/semi-private sources of emails and emails
posted to public listservs, arguing that though messages may be posted publicly
authors may nevertheless wish to mask participants’ identities (Herring, 1996;
Harrison, 1998).
b. Participants within this community were uniquely primed to consider, and
technically capable of implementing, technologies that allowed them anonymity or
pseudonymity – they talk overtly in some of the early emails about the likelihood
of information shared on the listserv being viewed and used in other contexts.
c. Though the majority of the emails on the listserv are attributed to real names,
there are a few participants who elected to use pseudonyms at various points in time.
The same is true of digital rights conferences today, at which some participants will
use pseudonyms to mask their real identity in the event their participation becomes
public.
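As a concrete sketch of the selective-reading workflow described in item 1 above, the snippet below filters an mbox copy of the list to messages within a flagged debate period that mention chosen keywords. It uses only Python’s standard library; the archive path, date window, and keywords are hypothetical placeholders of my own, not the actual parameters of my study:

    # Sketch: filter an mbox copy of the listserv to messages in a flagged
    # period that mention chosen keywords. Standard library only; the path,
    # dates, and keywords are hypothetical placeholders.
    import mailbox
    from datetime import datetime, timezone
    from email.utils import parsedate_to_datetime

    ARCHIVE = "cypherpunks.mbox"                       # hypothetical local copy
    START = datetime(1993, 4, 1, tzinfo=timezone.utc)  # e.g., a flagged debate
    END = datetime(1993, 8, 1, tzinfo=timezone.utc)
    KEYWORDS = ("clipper", "escrow")

    for msg in mailbox.mbox(ARCHIVE):
        try:
            sent = parsedate_to_datetime(msg["date"])
        except (TypeError, ValueError):
            continue                                   # skip undatable messages
        if sent.tzinfo is None:                        # normalize naive dates
            sent = sent.replace(tzinfo=timezone.utc)
        subject = (msg["subject"] or "").lower()
        if START <= sent <= END and any(k in subject for k in KEYWORDS):
            print(sent.date(), msg["from"], subject)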
Interviews and Participant Observation
Lastly, I used interviews and participant observation to study present-day cryptographic
projects in the final chapter. I provide a more detailed analysis of my methodological framework
and its relationship to my findings in that chapter, but will briefly summarize them here. I drew on
the methodological frameworks of multi-sited ethnography (Marcus, 1995) and networked field
studies (Lingel, 2017). As Marcus notes, multi-sited ethnography is particularly attuned to studies
of the complex social and cultural time-spaces that make up the field of science and technology
(Marcus, 1995, 104). The crypto community is exemplary of this kind of complexity: it has no
persistent physical base or even a bounded group of individuals, instead operating through
amorphous online collectives that gather, in distinct combinations, at international conferences. I
attended several of these during my field work, including the Chaos Communication Congress
(33C3), Internet Freedom Festival (2016-2018), RightsCon (2016), and the Crypto Summit 2.0
(2016).
Each of these conferences attracts a slightly different part of the crypto community: the
CCC is Europe’s largest and one of the world’s oldest hacker organizations (Kubitschko, 2017),
described to me by one member as “the place where hackers go to discuss social issues” (Interview,
Dec. 28, 2016). By contrast, the IFF is a more recent convening, and has only existed in its current
form for three years. The IFF explicitly aims to be inclusive of both technical and non-technical
people working in the digital rights space, and is designed to draw in members of non-Western
origin (Interview, May 31, 2017). Finally, RightsCon is a summit geared toward bringing together
technology companies, government representatives, lawyers, technologists, and civil society
members to discuss human rights challenges. Of the conferences I attended, it is the most formal
and oriented toward the technology industry. I also attended the ancillary Crypto Summit hosted
alongside RightsCon in 2016, which was focused specifically on addressing issues of cryptography
and policymaking and was similar in format.
I found Hintz and Milan’s (2010) work to be useful in formulating my approach to this
space. Hintz and Milan discuss challenges associated with examining grassroots activism in technical
communities, highlighting the importance of bridging the social, cultural and ideological gulfs
between researchers and interview partners given widespread skepticism toward academic
research in these communities. Though I found most of those I interviewed to be actively interested
in participation, I did encounter resistance on one occasion from a participant concerned about the
implications of publishing research about the community. Rather than continue the interview, we
decided to pause so that we could discuss their concerns in more detail, and I incorporated them
into my approach to the project in subsequent interviews.
Hintz and Milan also discuss the methodological implications for conducting engaged
research, emphasizing the importance of building mutual, respectful relationships, designing
projects around action research that will contribute to solutions, and critically reflecting on the
identity of oneself as a researcher. They conclude that a combination of document analysis, semi-structured interviews, and participant observation is least likely to impose the researcher's concepts and assumptions; this is the approach I have taken in this dissertation.
I have been an active participant in the digital rights community since 2012, and was often
recognized as such throughout my research. I first began my work as the Managing Editor of the
Netizen Report for Global Voices Advocacy, a weekly compilation of news and updates relating
to digital rights issues from around the world. I then went on to work as a Google Policy Fellow
for the Electronic Frontier Foundation in 2015, and participated in several digital rights
conferences in my capacity working on EFF projects. Though immensely helpful for building my familiarity with the digital rights community, this background required some reflection on how I would present myself as a researcher in this space. The lines between academia and advocacy are somewhat
blurry, and there is a long lineage of academic researchers crossing over to participate in advocacy
projects and contribute the results of their research to discussions in the digital rights community.
Less frequently do academics attend these conferences for the explicit purpose of conducting
research, though this is not unheard of.
I made sure to be clear about my participation as an academic researcher while conducting
my field work, both by discussing my project with possible participants and participating in
sessions designed to bridge the academic-advocacy divide. Though I received an IRB exemption
for my research, I asked interviewees to review a disclosure sheet and confirm their willingness to
participate prior to engaging in an interview.
I also made my research protocols, particularly the steps I took to protect the privacy of
participants, especially visible in my research. I used a VPN, regularly updated security patches,
and opted, wherever possible, to communicate with participants over encrypted channels: Signal
was preferable, but at times we opted to use PGP as an alternative. Occasionally, a participant
would signal that they were comfortable with and preferred to communicate in the clear – that is,
on an unencrypted channel – and in the instances where this was the case, I followed the
participant’s preferences after considering and reviewing whether any sensitive material would be
discussed. Most of my interviews were conducted in person, with my type-written notes stored on
an encrypted hard drive. On occasion, I opted for hand-written notes and transcribed them after
the fact. All participants were given the opportunity to define whether their accounts would be
attributed to their real name, to a pseudonym, or to remain anonymous. Given evolving sensitivities
in the community due to the changing political climate, I opted to anonymize all accounts with one
exception: an interview with the organizer of a conference, which was substantively focused on
the conference and contained information that would be identifiable. I received the organizer’s
permission to attribute her name to this account. In general, I found that attending to my operational
security was an important element of trust-building given sensitivities around digital security in
this space; it marked me as both a member of the community and as a researcher who cared about
protecting the privacy of her participants.
Lessons
Though there’s a long tradition of mixed-method approaches to studying the work of
technological communities, I learned some unique lessons over the course of this project that
pertain to this particular subject matter. Each of the archives I visited, through their ordering of
materials, inclusions and exclusions, presented a particular vision of the past. Given the high level
of secrecy about cryptography, these archives nearly always felt contingent and partial – I was
often challenged to consider what documents might be missing from the record without any sure
evidence that this was the case. I sought to fill gaps in the record by tracing across them – for
example, when I received only a handful of documents from the IBM Research Archive, I sought
out others from a range of collections. This meant that the study didn't have the sense of
'completeness' that it might have had, had I had access to the full collection on file – but it did prime
me to identify gaps, or to consider the reasons why certain materials were – and likely still are –
missing from this record. It also drove home the awareness that the authoritative record of the
history of cryptography as described in narratives by some of its key figures is itself a social
construction. As I interpreted my findings I sought to surface contradictions in accounts, mixing
of metaphors and chronological gaps throughout this history, and, where appropriate, to
foreground the work of individuals whose contributions did not receive as much attention as others.
Over time, I was also driven to consider the normative implications of my project as a result
of changes in the field. In 2016, the digital rights community began an extended process of
addressing widespread sexual abuse, some of which occurred at the conferences I attended. The
abuse was driven by highly charismatic and powerful individuals, perpetuated across networked
conference sites and communities (thus making it less visible), and encouraged by broader cultural
tropes of ‘rock stars’ – I address these dynamics more concretely in chapter five. I began to realize,
through my continued participation in the space and my own work reading archival texts, the role
that history plays in supporting and perpetuating these power dynamics in the community.
Historical accounts can serve as a form of myth-making, highlighting the charismatic stories and
unique, trail-blazing contributions of great thinkers in the field. In doing so, they can contribute to
the marginalization of community members by not making visible their contributions. This made
me reflect on the nature of a project like this – would I, in my choice of subject matter, be producing
a text that would perpetuate the 'rock star' dynamic in the community? Or could I take a
different approach?
I attempted to write a more heterogeneous account of this community, not only by
including the flaws and mistakes of key figures, but by also emphasizing how many of the key
contributions in the field are communal, and derive from the work of people working together and
building on one another’s projects, rather than from the work of individuals pioneering on their
own. I had hoped for, and explicitly sought out, contributions from women, people of color, non-
Westerners and others who have largely been marginalized from this community. This is best
reflected in the last chapter of the dissertation, which draws on my ethnographic field work and
foregrounds the Internet Freedom Festival, an event that has the most diverse make-up I have seen
in this space. My efforts in the historical sections of this dissertation were limited, unfortunately,
by the fragmentary material I found in the archives. I'm unsure whether to attribute this to what
curators felt was authoritative, to what work was written down and under whose name it was
published, or to the barriers to participation in departments during this time period. I hope to
address this question more thoroughly in the next stage of my work.
Chapter Organization
This dissertation proceeds in five parts, each dealing with a distinct period of time and set
of transformations in the underlying cultural meaning of cryptography. Chapter one provides an
explication of cryptography, grounding it as an analytical concept and providing a pre-history that
traces through several co-existing meanings that it has been associated with through the early 20th
century. Chapter two examines the invention of public key cryptography as a moment of
transformation, considering how public key cryptography is both reflective of and a causal agent
in shifts in the social meaning of cryptography. Chapter three moves on to the question of whether
cryptography can be commercialized, examining two key cases, the RSA algorithm and PGP. It
argues that the commercialization of cryptography has been deeply shaped by an ambivalent
regulatory position by the US government that largely discouraged corporate adoption other than
its use for e-commerce. This had the effect of limiting the adoption of encryption as a form of
privacy protection and marginalizing privacy-protective encryption software. Chapter four
considers alternative visions for cryptography as they developed within what I call a hobbyist
community, the cypherpunks. I consider how the cypherpunks elaborated a new vision for
cryptography tied to notions of sovereignty, which they called crypto-anarchy. Finally, chapter
five examines new interpretations of cryptography as they are evolving in the present day. I
conclude that changes in the make-up of the crypto community are driving new visions for
cryptography grounded in social justice, as opposed to civil liberties: they see the pursuit of
autonomy through encryption as a collective, rather than individualized, pursuit.
Collectively, these chapters tell a story about cryptography as it evolved alongside
networked infrastructure: countering the reigning model of surveillance capitalism, my analysis
highlights alternative internet imaginaries where we are not tracked and our data is secured and
authenticated, and reveals that a lack of diversity in technical circles led to a narrow construction
of the “user” of encryption technologies, foregrounding their market value over other possible
meanings. By integrating accounts of technological production, sociotechnical communities, and
policymaking processes, I show how encryption became embedded in networked infrastructures
in ways that privileged the nascent commercial internet industry while inhibiting the development
of individual privacy and community-level autonomy, and how it became the object around which
many cryptophiles re-imagined an internet that treated autonomy, rather than connectivity, as its
central value.
Chapter 1: The Paper Labyrinth: Cryptography as a Cultural Artifact
Cryptography is both an art and a science of communication. It enables the selective
revelation of information to some and not to others, adding asymmetries to the process of
communication that imbue messages with new forms of power. Cryptographic systems exert
control over access to information through the construction of their infrastructure and design: they
push the limits of written communication, experiment with new forms of visual representation
of inscribed meaning, or transform that meaning using mathematics.
But whether and to whom access to the hidden meaning in a text is selectively available is
also a social and political question. Recent policy debates over encryption reflect a struggle over
the information asymmetries that have arisen in the context of the network society (Castells, 1996).
Over the last decade, we have undergone a process of deep mediatization (Couldry and Hepp,
2016), recording the most intimate details of ourselves as we move through time and space. As
we have incorporated technologies into our daily habits, the amount of metadata we produce has bloomed,
leaving innumerable data traces that may be used to optimize our lives.
As the Snowden revelations demonstrated, these data traces are not scattered to the wind,
ephemeral and fleeting. Rather, they are commoditized, mined for their economic potential and
harvested by intelligence agencies in the name of national security (Zuboff, 2015; West, 2017).
The work of surveillance scholars situates these transitions in their political and economic context
(Lauer, 2017, Schneier, 2015), observing how systems of surveillance lead to new forms of
algorithmic control (Pasquale, 2014) and are interwoven with historical patterns of discrimination
(Browne, 2016).
The policy debate over encryption centers on questions about whether and under what
conditions digital information should be obscured by making it indecipherable to anyone who does
not have the key to decode it. Privacy advocates posit encryption as an important, if partial,
solution to the harms posed by mass surveillance. In the face of growing incursions on our privacy
by the state and market, and insufficient accountability from regulators, encryption can serve to
bolster the rights of individuals.
By contrast, law enforcement agencies argue that encryption presents an existential
challenge: investigators have come to rely on the ability to wiretap in order to track down people
engaged in violent extremism, using bulk collection and network analysis to map the
communications networks of possible terrorists. They claim that the widespread adoption of
encryption could lead to the data traces produced by these suspects suddenly “going dark”
(Homeland Security Committee, 2016).
These two perspectives illustrate two distinct conceptualizations of the political valence of
encryption. Authorities assert there must be ways of using encryption to protect secrets from
adversary nations while granting law enforcement access, while advocates argue this is not
mathematically possible without weakening encryption such that it could easily be broken by
adversaries. This often resolves into a political stalemate due to differing understandings of what
is both technically and mathematically possible.
In some ways, the encryption debate has displaced the underlying argument over achieving
a balance between the state's need for national security and sovereignty and the individual right to
privacy.[4]
[4] Many encryption advocates contest the notion that there is a binary opposition between privacy and
security – rather, they see encryption as serving both these purposes. For examples, see: Gill, 2017 and
Abelson et al., 2015.
These arguments verge on treating encryption as a teleological goal in itself; what Gürses,
Kundnani and Van Hoboken (2016) refer to as "crypto as a defense mechanism". By reducing the
argument to technical solutions, this response fails to account for the political nature of the
surveillance problem, downplaying its social consequences and ignoring issues of race, gender,
and class.
Ultimately, these arguments over encryption are not about encryption software itself, but about who
has access to information and at what scale. Thus, the encryption debate is central to the field of
communication. It asks, what are the ‘right’ relationships between information and power, and
how are these relationships defined? Understanding the politics of encryption requires teasing out
these questions in a nuanced way, placing them in dialogue with the broader landscape of social
and technological change.
As cryptographer Phil Rogaway writes, “That cryptographic work is deeply tied to politics
is a claim so obvious that only a cryptographer could fail to see it” (Rogaway, 2015, 3). How the
application of encryption embodies specific forms of power and authority matters as much as the
discursive meaning of its hidden text. Interrogating which of these forms are embodied in
particular applications of encryption requires raising questions about who is privy to the secrets it
obscures, how codes and ciphers are designed to obscure information to some and not to others,
and who gains access to the technologies of encryption in the first place.
Chapter Outline
This chapter serves two purposes: first, it serves to explicate encryption as a cultural artifact
and the object of analysis for this project in order to define my terms and ground them analytically.
I unpack three dimensions of encryption as process, artifact, and infrastructure. Then, it situates
encryption in the context of broader, historical trajectories by engaging with several distinct
discourses that have developed around cryptography over time. My argument here is that, in the
vein of Langdon Winner, encryption has politics – but its politics are multiple and at times
contradictory.
There is a rich tradition in STS scholarship that examines how discourse shapes and
becomes structurally embedded in technical artifacts, as well as how those artifacts help make new
forms of politics possible (Edwards, 1996; Agre, 1997; Gillespie, 2006). By acknowledging the
multiple forms of values and meanings associated with cryptography, I argue, it becomes possible
to articulate how particular political configurations show up in different types of cryptographic
projects – and reconcile the incongruities between them.
To surface each of these readings, I analyzed canonical histories of cryptography across
the disciplines of computer science, literature, and early modern history, using thematic coding to
identify dominant themes and historical trajectories, and then working within each theme to
construct a narrative that traces the evolution of the thematic material over time. Each of these
readings is intertwined in contemporary policy debates, but may not be clearly identifiable as
such. The work of this chapter is to define and delineate them, illustrating how each reading
suggests different configurations of power and authority associated with encryption.
The remainder of this project will adopt a narrower frame that examines only particular
uses and interpretations of encryption that developed in relation to networked infrastructures in the
late 20th and early 21st centuries. But cryptography is not purely or solely a form of technology,
nor is it purely a product of modernity. By taking a more expansive view in this chapter, I hope to
open up the wide range of cultural meanings that have been associated with
cryptography over the course of history, many of which persist today.
What is Encryption?
The early modern cryptographer Samuel Morland once described cryptography as a “paper
labyrinth”, designed to draw readers into an interior world of dark twists and turns where the truth
cannot be surfaced without access to a key (Ellison, 2008). Though rarely regarded as such,
cryptography is thus deeply tied to communicative practices.
Most texts on cryptography – its mathematical principles as well as its history – begin with
a brief glossary of terms. They generally start with a statement somewhat like the following, from
the Oxford English Dictionary: encryption is a “Noun. The process of converting information or
data into a code, especially to prevent unauthorized access” (Oxford, 2017). This definition
captures a number of different aspects of the concept: encryption as both an object (Noun.) and a
process (of converting information or data into code). It is often used, as the definition suggests,
“to prevent unauthorized access” – rendering its contents unintelligible to anyone without the key,
or the capacity to break the code.
Encryption is also often inscribed into technical artifacts.[5] Here, two new distinctions are
drawn around what kind of inscription is involved: ciphers, which operate on individual letters in
an alphabet, and codes, which replace entire plaintext words and their underlying meanings (Kahn,
1967). Similarly, to encrypt or encipher something means the process of translating a piece of
plaintext into a ciphered text, while to encode it means to translate the meaning of the plaintext
into code. When it comes to the process of returning a code/cipher to its original plaintext, the
actor's intent comes into play, as well as the environment in which they are acting: if the person
has legitimate possession of the key or the system needed to convert the cryptogram back to its
original plaintext, they are deciphering or decoding the text. If they are a third-party adversary –
someone without possession of the system or key – they are cryptanalyzing, or codebreaking, the
text.
[5] Whether encryption is in fact an artifact depends on its application – encryption is, strictly speaking, a
process, a distinction not to be glossed over. However, in its modern use, the process of encryption is
almost always inscribed in a technology or standard. In my descriptions, I use encryption to refer to the
inscription of cryptography in technical artifacts, and use the term cryptography to refer to the wider field
surrounding its theoretical foundation and practical application.
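To make the distinction concrete, the following sketch (my own illustration, not drawn from the sources cited above) contrasts a simple cipher, which transforms letters one at a time regardless of their meaning, with a code, which substitutes whole words via a shared codebook:

```python
# Illustrative sketch of the cipher/code distinction (hypothetical example).

# A cipher operates on individual letters: here, the classic Caesar shift.
def caesar_encipher(plaintext: str, shift: int = 3) -> str:
    result = []
    for ch in plaintext.upper():
        # Shift alphabetic characters; leave spaces and punctuation alone.
        result.append(chr((ord(ch) - 65 + shift) % 26 + 65) if ch.isalpha() else ch)
    return "".join(result)

# A code operates on whole words and their meanings, via a shared codebook.
CODEBOOK = {"ATTACK": "RAINBOW", "DAWN": "TEACUP"}  # invented codewords

def encode(plaintext: str) -> str:
    return " ".join(CODEBOOK.get(word, word) for word in plaintext.upper().split())

print(caesar_encipher("ATTACK AT DAWN"))  # -> DWWDFN DW GDZQ
print(encode("ATTACK AT DAWN"))           # -> RAINBOW AT TEACUP
```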
Finally, encryption is increasingly implicated in infrastructure, and the term encryption is
often used interchangeably with the systems it is built into. Encryption is inscribed in the structures
and technologies of the internet and works invisibly to support our online activities (Star &
Ruhleder, 1996). Encryption technologies are behind every credit card transaction, Bluetooth
connection, and mobile phone call made by billions of people worldwide. They are used during
the authentication of connections, protecting the link between our browsers and the servers of the
websites we navigate to. They protect data at rest, ensuring that private
information stored on servers is not easily accessed or changed by third parties.
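As a small illustration of encryption in transit (my own sketch, using Python's standard library rather than any particular system discussed here), the handshake below authenticates a server and negotiates the keys under which all subsequent traffic is encrypted:

```python
# Hypothetical illustration: opening an encrypted, authenticated connection.
import socket
import ssl

hostname = "example.com"  # placeholder host for illustration
context = ssl.create_default_context()  # verifies the server's certificate

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        # After the handshake, everything written to `tls` is encrypted
        # on the wire; eavesdroppers see only ciphertext.
        print(tls.version())  # e.g. 'TLSv1.3'
        print(tls.cipher())   # the negotiated cipher suite
```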
Each of these infrastructures is an application of encryption, constructed by technologists
and deployed in particular ways. And thus, there are values and ethical principles inscribed in the
depths of the systems that deploy encryption. It is my aim in this dissertation to surface some of
them, to hold them up to the light and ask questions about how and why they were implemented
in particular ways in our networked infrastructure, and not in others.
In summary, these introductory moments in cryptography texts are meaningful for scholars
of communication and technology, because they signal that encryption is not just a technical but a
sociocultural process. The work of cryptographers has always been innately intertwined with the
interrelationships between written language and culture: our ability to communicate with one
another across time and space through writing is accompanied by an inevitable need to retain a
zone of privacy and disconnection. As historian of cryptography David Kahn writes,
as soon as a culture has reached a certain level, probably measured largely by its literacy,
cryptography appears spontaneously – as its parents, language and writing, probably also
did. The multiple human needs and desires that demand privacy among two or more people
in the midst of social life must inevitably lead to cryptology wherever men thrive and
wherever they write (Kahn, 1967, 84).
What Encryption Is; What It Should Do
The remainder of this chapter is split into five sections, each describing and analyzing a
different reading associated with encryption: the occult, the state, commerce, intellectual and
literary pursuits, and human rights. Much of my analysis focuses on the ancient history of
cryptography, which may be less analytically relevant to later chapters. However, my argument
here is that the discourses that emerged in these historical contexts persist in the present day and
are important to understand the recent history of cryptography. In each section, I trace a distinct
reading of cryptography in association with each domain, interrogate the values implicit in them,
and explain how these values surface in contemporary understandings of cryptography.
Encryption and the Occult
The first and one of the oldest domains in which cryptography emerges associates the
transformation of writing with secrecy, magic, and the occult, an association that lives on today as
much in the writing of the thrillers of Dan Brown and his ‘symbologist’ Robert Langdon as in
claims by Google CEO Eric Schmidt that “If you have something that you don’t want anyone to
know, maybe you shouldn’t be doing it in the first place” (Huffington Post, 2010).
By associating cryptography with occult practices and linking secrecy with poor moral
character, this discourse works to stigmatize the use of encryption. But in its original form, the
association between cryptography and the occult served a more spiritual purpose. Some of the
earliest uses of cryptography worked to mystify texts, using obfuscation not so much as a way of
masking its meaning from adversaries as a means to add a layer of symbolic meaning to written
words. Early practices include the use of hieroglyphics in Egyptian funerary formulas, rune-writing
in Scandinavia and Anglo-Saxon Britain, necromancy in the Roman empire, and the use
of codes in religious texts such as the Hebrew substitution cipher Atbash, used throughout the
Bible and other Jewish mystic writings to encode important names and words. The employment
of codes and ciphers in mystic texts became a subject of fascination for the devoted, who developed
a practice of decipherment and interpretation to unlock the deeper meanings embedded in religious
texts.
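Atbash simply reverses the order of the alphabet, pairing the first letter with the last, the second with the second-to-last, and so on. A minimal sketch (my own, transposed from the Hebrew to the Latin alphabet for illustration) shows the scheme, and that the cipher is its own inverse:

```python
# Atbash over the Latin alphabet (illustrative; the original is Hebrew).
import string

ALPHABET = string.ascii_uppercase
ATBASH = str.maketrans(ALPHABET, ALPHABET[::-1])  # A<->Z, B<->Y, ...

def atbash(text: str) -> str:
    """Apply the Atbash substitution; applying it twice restores the text."""
    return text.upper().translate(ATBASH)

print(atbash("ATBASH"))          # -> ZGYZHS
print(atbash(atbash("ATBASH")))  # -> ATBASH (self-inverse)
```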
The association of encryption with religious mysticism took on a darker tone by the 16th
and 17th centuries, but not necessarily because cryptography was actually used in occult practice.[6]
One of the most famous early modern cryptography manuals, Steganographia, was written by the
German cryptographer Trithemius and published posthumously in 1606. Steganographia was long
believed to be a book about black magic first and a cryptography manual second (Ellison, 2017;
Kahn, 1967). For years, it was held up as an example that tied the emerging discipline of
cryptography to the practices of early modern magic: the historian of cryptography David Kahn
writes of this period, "cryptology had served to obscure critical portions of writings dealing with
the potent subject of magic – divinations, spells, curses, whatever conferred supernatural forces on
its sorcerers" (Kahn, 1967, 91). This historical interpretation is understandable – the text of the
manuscript makes claims about instructing the reader in the use of spirits to send messages over
great distances. But in the 1990s, cryptographers finally deciphered the text of its final volume,
revealing these interpretations to be misguided. They found Steganographia to be a text that is
centrally focused on cryptography, but was disguised as a book purely focused on magic (Reeds, 1998).
[6] In tracing this lineage, I left out some critical developments in the history of cryptography, such as its
development in the Greco-Roman empires and its role in the Italian renaissance. I address some of these
gaps in my tracing of the other readings of cryptography below. For a more comprehensive history of
encryption, see Kahn (1967), Singh (1999), Ellison (2017), and Potter (1989).
The idea that cryptography is an occult practice reflects the idea, as persistent at the time
as it is today, that secrecy is a mark of poor moral character. The sociologist Georg Simmel rejected
this notion, saying that “secrecy is a universal sociological form, which, as such, has nothing to do
with the moral valuations of its contents” (1906, 462). And yet the association of secrecy with
negative social values persists: for example, Facebook CEO Mark Zuckerberg has made statements
that suggest that hiding one’s identity is a sign of a lack of integrity, reasoning that inhibiting
Facebook users’ capacity to obscure their identities will lead to more civil discourse.
Early modern cryptographers attempted to disassociate encryption from the occult by
aligning it with the emerging disciplines of the liberal arts, repositioning practices once considered
to be magic, such as alchemy and astrology, into experimental and scientific practices like
chemistry and astronomy (Ellison, 2017, 72). A major part of this effort required making
experimental philosophy public. Shapin and Schaffer (1985) write of the practice of experimental
witnessing, making the production of knowledge public in order to differentiate matters of fact
from matters of belief. As they put it, “Matters of fact were the outcome of the process of having
an empirical experience, warranting it to oneself, and assuring others that grounds for their belief
were adequate” (Shapin & Schaffer, 1985, 25).
This would seem, at first, to contrast with the work of cryptographers, who sought to create
methods for secrecy and non-disclosure. Despite this, Macrakis (2010) finds that during the
Renaissance there was an explosion of interest in secret communication among the same
participants in the scientific revolution who espoused the need for open communication. This may
be attributed to several factors, including the need to protect trade secrets (a dynamic explored
more thoroughly below; Macrakis, 2010), political dangers posed to early scientists by
ecclesiastical and civil authorities (Hull, 1985), and the persistence among scientific communities
of the paraphernalia of secret political and religious orders, such as the adoption
of secret names, emblems, and oaths to brotherhood (Eamon, 1985).
Indeed, the circulation of published texts – encrypted or otherwise – in England during the
17th century was in itself subversive. Manuscripts were often spread by clandestine means in order
to evade the eyes of government censors. Secret writing is thus intertwined with the practices of
reading and writing, and made urgent by the widespread availability of printed matter through the
invention of the printing press (Jagodzinski, 1999). As Ellison writes, cryptography “was as much
a global communication system for knowledge sharing as it was also a system for hiding and
concealing cultural secrets. It was as much an attempt to standardize communication across nations,
ethnicities, and languages as it was a means of discriminating between audience members and
preserving cultural difference” (2017, 17). It was only after the practice of reading and writing
became widespread that the concepts of privacy and secrecy finally discarded their occult
associations and developed a more neutral meaning (Jagodzinski, 1999, 24).
But encryption’s occult origins never fully went away. A common argument against the
widespread use of encryption in the present day is that “if you aren’t doing anything wrong, you
should have nothing to hide”. This argument perpetuates the idea that individuals seeking privacy
must be undeserving of its protections, an argument that Daniel Solove critiques as flawed, because
it masks the discrepancies in power between citizens and the surveillance state (Solove, 2007).
This information asymmetry is reinforced in contemporary contexts by the technical
practice of surveillance, in which seeking privacy through the use of technologies like the Tor
browser becomes a trigger for higher levels of targeting by intelligence agencies (Cox, 2014).
These ideas are almost never explicitly contextualized historically or linked to the complex set of
factors that tied cryptography to occult practices in the early modern era. But the discursive
association between cryptography and the occult is powerful: despite the efforts of cryptographers
over centuries to establish the practice as a science, it retains the residual mark of these dark
associations.
Cryptography and the State
Another dominant reading of cryptography centers the art of secret writing as a tool of the
state. In this domain, cryptography is used as a strategic advantage over adversaries for states
waging war in a geopolitical battlefield. As Lois Potter puts it in her book Secret Rites and Secret
Writing: Royalist Literature 1641-1660, “Mystery is an advantage for any party in power, and,
since knowledge is power, any party out of power will naturally demand further access to it. At
the same time, any party which is denied access to the open expression of its views will express
them covertly if it can” (Potter, 1989, 209).
The assertion that cryptography has historically been monopolized by state authorities
requires some unpacking, however. The contemporary debates over the legal status of encryption
reveal contradictions between two overlapping perspectives on the proper role of cryptography
within states: cryptography as a tool for national security, and cryptography as a tool for state
secrecy. These differing perspectives were once compatible, but are increasingly in conflict with
one another: whichever of them dominates will have important implications for the configuration
of power in the state’s orientation toward cryptography.
Cryptography and National Security
Cryptography is a key part of the apparatus of state national security: whoever has access
to cryptography has a strategic advantage over adversaries by opening up lines of communication
that cannot be intercepted. Thus, many states seek to shore up cryptographic resources by investing
in technologies and in the best minds the discipline has to offer.
Though it is not the only use, the most common way cryptography has been used by states is
in the military: for example, Herodotus writes that the use of secret writing saved Greece from
being conquered by Xerxes, the Persian king, when an exiled Greek citizen sent a message in code
to warn the Spartans of Xerxes' invasion plan (Singh, 1999). It is directly implicated in American
involvement in both world wars: the decipherment of the Zimmermann telegram by the British led
directly to American involvement in World War I, while the failure to piece together deciphered
intelligence indicating the attack on Pearl Harbor in time to prevent it led directly to American entry into
World War II (Kahn, 1967). The use of cryptography by military agencies reached a new pinnacle
during these conflicts, employed by nearly all combatant nations and codified through the
formation of new agencies devoted to cryptanalysis and cryptography. Modern histories of World
War II regard the cracking of the Enigma machine as one of the decisive victories that led to the
end of the war, while Sweden used cryptography decisively in order to maintain its neutrality
(Kahn, 1967).
But cryptography also has an important national security function during peacetime, and is
a part of the flowering of modern diplomacy between the 16th and 18th centuries: the principle of
secrecy in diplomacy was well-established among European states after the Renaissance (Roberts,
2006), and enacted through the use of encryption of diplomatic communications between
ambassadors and their home states. These communications were sometimes intercepted, opened
and cryptanalyzed by other states on the way, a practice pioneered by the French cryptologist
Antoine Rossignol and institutionalized by the formation of Black Chambers by countless other
states. The historian David Kahn writes that by the end of the 1500s, most European states kept
full-time secretaries who worked to read the ciphered dispatches of foreign diplomats and develop
official codes of their own. The sophistication of a state’s cryptologic capabilities thus became a
strategic advantage not only in war, but in peacetime as well (Kahn, 1967, 106-109, 157-165).
Cryptography in national security is thus about states' capacity to protect their own
communications and to infiltrate the communications of their adversaries. In this sense, it is zero-
sum: whoever has the most advanced cryptographic systems has a strategic advantage over others,
and can leverage this advantage for both military and diplomatic benefits.
Cryptography and State Secrecy
Cryptography also plays an important domestic function within states, by enabling state
secrecy. Historically, secrecy by the state was meant to symbolize and safeguard the dignity of
rulers and integrity of their functions (Hoffman, 1981), canonized by Tacitus in his history of the
Roman empire under the principle of arcana imperii, or secrecy for the state (Roberts, 2006).
7
7
In the early 20th century, Max Weber observed that the bureaucratic administrations of governments
thrived on secrecy, battling every attempt by legislatures to gain access to information about their inner
workings (Weber, 1946).
52
This orientation toward cryptography also seeks to maintain a state monopoly on the practice, but
to different ends.
One of the earliest examples of the extensive use of encryption by a government can be
observed within the pre-modern bureaucratic systems of the Abbasid caliphate. The Abbasids grew
a vibrant commercial industry through the administration of strict laws and low tax rates. In order
to maintain this system, administrators relied on the secure communication afforded by encryption
to protect their tax records and sensitive affairs of state (Singh, 1999).
More often, secrecy is used to mask corruption and impropriety among sovereigns. For
example, King Charles I of England used encryption extensively in his letters, which became the
subject of intrigue when they were leaked and published in 1645, revealing among other things his
distaste for Queen Henrietta Maria prior to their marriage. The King made the mistake of keeping
unciphered drafts of the letters in his papers, making the decipherment of the remaining texts all
the easier once captured. This led to both embarrassment for the already-encumbered British
royalist cause and, at the conclusion of the English Civil War, his execution for treason (Potter,
1989).
The embrace of secrecy has harmed states’ interests in modern times as well: for decades,
the United Kingdom was unable to claim its invention of the first programmable digital computer.
Because of the secret nature of the country’s advances in cryptography during the war, the UK
destroyed all records of its invention of the Colossus, the programmable digital computer used by
codebreakers at Bletchley Park to decrypt messages in the days leading up to D-Day. For years,
the US-made ENIAC was believed to be the first computer, even though Colossus was operational
three years earlier. The machine itself and much of the documentation about it were dismantled or
destroyed after the war and kept secret until the 1970s (Singh, 1999, Coombs, 1983).
The scandals of the 1970s led to an embrace of openness by democratic nations, though
this proved to be short-lived. The Church Committee, formed by the United States Senate, found
that secrecy in the Executive Branch had led to widespread abuse of powers, including the
surveillance of civil rights leaders, attempts at assassination of foreign leaders, and a thirty-year
program by the National Security Agency to obtain copies of telegrams leaving the United States
(Schwarz, 2015).
A Task Force on Secrecy concluded in 1970 that “more might be gained than lost” if the
US adopted “unilaterally, if necessary - a policy of complete openness in all areas of information”
(Moynihan, 1997, 61). The findings of the Task Force align with the observations of the sociologist
Georg Simmel that “Democracies are bound to regard publicity as the condition desirable in itself.
This follows from the fundamental idea that each should be informed about all the relationships
and occurrences with which he is concerned, since this is a condition of his doing his part” (1906,
469).
The spread of networked technologies has in some respects opened up unprecedented new
opportunities for intelligence agencies, giving them greater access to collect data. At the same time,
this capacity is not monopolized by the United States. This has led to a fracturing of the discourse
within and between government agencies around the usefulness of encryption: whether they
see cryptography as a friend or foe is closely tied to both their incentives and their views on
encryption’s relationship with power and authority.
For example, over the past forty years, the NSA and its UK counterpart Government
Communications Headquarters (GCHQ) have sought to limit the use of encryption worldwide: by
inserting vulnerabilities into encryption standards (for example, by compromising the random
number generator in the encryption standard adopted by NIST), promoting the use of backdoored
encryption devices (Levy, 2001), and engaging in legal battles to force access to encryption keys
(Harris, 2014). In recent times, current and former NSA officials have expressed support for
adopting a stance that recognizes the benefits of encryption, siding with those who see privacy as
a necessary part of national security, not an adversary to it. This is a view that the FBI does not
share – and neither do the governments of the UK, China, India, Senegal, Egypt, and Pakistan, all
of which passed laws that highly control or criminalize public use of encryption projects or
otherwise enable law enforcement authorities to compel decryption (Abelson et al., 2015; Levy,
2001). To complicate matters, state secrecy made a forceful return in the years following the onset
of the War on Terror, resulting in the expansion of systems of classification and the adoption of
secret tribunals to make critical decisions about surveillance authorizations.
Thus, though the narrative of encryption as a tool of the state continues to be a dominant
force in encryption policy, it is increasingly complicated and fraught with inter-agency conflict.
Despite these complications, it remains true that when viewed through the lens of state power,
encryption becomes part of a battlefield of intelligence in which states seek to exploit the
weaknesses of others to their advantage.
Cryptography and Commerce
For thousands of years, businesses have sought to use encryption to protect trade secrets:
the earliest known use of encipherment comes from a potter in ancient Mesopotamia, who
used it to hide a formula for making pottery glazes. Some have speculated
that Da Vinci’s employment of mirror writing was similarly adopted in order to protect his
inventions (Eamon, 1985). Encryption is thus not only useful to states, but to businesses seeking
an advantage over competitors. The incentives and power configurations here are broadly similar
to those of cryptography and national security, described above, but surface in distinct ways.
If military intelligence agencies fuel innovation in cryptography, businesses fuel
innovation in its practical adoption. The commercialized use of codes advanced considerably along
with the invention of the telegraph: businesses wanted to take advantage of the rapid
communications afforded by new technologies, but needed a way to do so while evading the prying
eyes of telegraph operators. This led to the development of telegraph codes, an array of new
languages used by telegraph operators to send messages across space and time. The use of
telegraph codes had advantages for businesses in addition to privacy: they enabled companies to
save money by reducing the number of words used to send messages, and reduced the risk of
telegrapher error by eliminating homonyms (Singh, 1999; Kahn, 1967).
Cryptographic inventions were also marketed to businesses by entrepreneurs. The
cryptography industry experienced a period of particular growth immediately following World
War I, with the demonstration of new uses of cryptography during war complemented by the post-
war expansion of international trade. These entrepreneurs grappled with the longstanding question
of whether there is a market in privacy: Kahn writes that “The understandable desire to get rich
motivated the efforts of such men as Damm, Hebern, and Scherbius, the inventors of the rotor; the
AT&T Company promoted Vernam’s machine in the hope of profit. The inventions of these men
enriched cryptography if not themselves, but the efforts of many others contributed nothing either
to cryptology or to their own pockets” (Kahn, 1967, 825).
The example of one of these inventors, Arthur Scherbius, is illustrative: the inventor of the
infamous Enigma machine, Scherbius initially sought to market his invention to both the military
and business community. He found few takers among the businesses he spoke with, most of whom
said they could not afford Enigma’s security. Singh writes that though the businesses said it was
too expensive “Scherbius believed that they could not afford to be without it” (1999, 138). Despite
his relative failure in the private sector, he found success before long by selling the machines to
the German military: by 1925, he was mass-producing the machines, which he designed to be
distinct from the few commercial machines he sold before the war.
The importance of cryptography to World War II and institutionalization of its use by
intelligence agencies hobbled its commercial development. States sought to protect their monopoly
on encryption, policing its use in the public sector by asserting the privilege of national security
and hiring bright young minds interested in working on cryptography. Development of
cryptography within the private sector thus largely stagnated until the invention of public key
cryptography by Whitfield Diffie and Martin Hellman in 1976. Their remarkable achievement
meant that encrypted communications channels could be established between two parties without
having to secretly share their keys beforehand. This led to a renaissance in the development of
cryptosystems by academic and amateur cryptographers, and the growth of interest in their use by
companies. The cryptographers Ron Rivest, Adi Shamir and Leonard Adleman were the first to
patent an implementation of public key cryptography, and licensed it to businesses under the
moniker RSA. Still, for years the limits of available computing power meant that only large
businesses – those with the money to purchase and use large mainframe computers – were able to
use RSA's cryptosystem (Singh, 1999).
As computers became both more powerful and cheaper to produce, the value of encryption
to consumers grew. Encryption proved essential to the development of the e-commerce industry:
it was the only way to securely convey payment details over the internet between consumers and
companies. In 1999, Simon Singh called encryption “the only way to protect our privacy and
guarantee the success of the digital marketplace. The art of secret communication, otherwise
known as cryptography, will provide the locks and keys of the Information Age” (Singh, 1999,
xiv). The incorporation of encryption in internet infrastructure was interrelated with commercial
motives: it was used where it would facilitate e-commerce, but not in areas that would increase
users' privacy and security at the cost of business interests.
Much of the encrypted critical infrastructure that makes the internet secure is developed
and maintained by members of the free and open source software community. However,
commercial companies rely on this infrastructure for their own products. While some companies
provide much-needed support by paying salaries to developers who work to maintain the code,
the maintenance of encryption infrastructure remains substantially underfunded. Moreover, the
development of new uses for encryption, particularly uses motivated by privacy and security rather
than commerce, is largely supported by the public sector and foundation grants. This has left the
present-day relationship between encryption and commerce in a complicated place.
Encryption as an Intellectual Pursuit
Encryption is sometimes used in ways that are devoid of any commercial or political
motives: for some, it is a purely satisfying intellectual pursuit. Though much of my discussion of
cryptography thus far has focused on applied cryptography, there is a vibrant field of academic
cryptographers engaged in producing new cryptographic theory unlikely to be used practically but
nevertheless mathematically interesting. This approach to the discipline tends to be disparaged by
applied cryptographers, who are interested primarily in work that is of practical relevance.
However, the intellectual tradition of cryptography remains a powerful part of its contemporary
practice, most clearly visible in its ties to literature.
The use of encrypted text in literature has played an important role in inspiring
cryptographers: not only their intellectual interest in codebreaking, but also their ideas about what
cryptography can and should do in the world. William Friedman, widely regarded as America’s
foremost cryptographer, credited Edgar Allan Poe's tale "The Gold-Bug" with introducing him to
cryptography as a child and sparking his interest in the discipline (Rosenheim, 1997). Poe is lauded
among cryptographers for his experimentation in code-writing in his short stories. His use of
cryptography is relatively apolitical, treating codes and ciphers as a kind of intellectual game.
"The Gold-Bug" is the most famous of this genre, inspiring many young cryptographers with its
tale of a search for buried treasure hidden by the pirate Captain Kidd. Submitted to the Philadelphia
Dollar Newspaper in 1843, "The Gold-Bug" is Poe's most widely-read short story and is credited with
popularizing cryptography during the period (Rosenheim, 1997).
Cryptography has also been used both to uphold and debunk conspiracy theories among
literary scholars. Friedman and fellow cryptographer Elizebeth Smith (later Friedman) famously
used cryptography to demonstrate that Sir Francis Bacon was not the author of the works of
William Shakespeare. Employed by Elizabeth Wells Gallup to decode what she believed to be
messages hidden by Bacon in Shakespeare’s works, the Friedmans were able to prove the contrary
using cryptographic techniques. They eventually published the book The Shakespearian Ciphers
Examined, dismissing Gallup and other Baconians’ claims about ciphers hidden in Shakespeare’s
texts and creatively embedding a cipher of their own in the text that spelled out “I did not write
the plays. F. Bacon.” (Kahn, 1967).
Codes and ciphers are often used in works of fantasy and science fiction to add layers
of mystification to the text. They evoke possibilities of the existence of a "second world alongside
of the obvious world, and the latter…most strenuously affected by the former”, as Simmel once
put it (1906, 462). In the Lord of the Rings trilogy and its ancillary books, Tolkien uses codes to
embody the fictional domains of humans, elves, and hobbits as much as to illustrate the barriers
between them. The decipherment of these mystical ciphers by the wizard Gandalf emphasizes the
value of the practice of cryptography in being able to move fluidly between different realms.
Similarly, Martin Paul Eve writes of the integral meaning of passwords to the world of Harry Potter
in his book Password, elaborating on Rowling’s choice to use wizards’ wands as a second form of
authentication layered on top of the use of a coded incantation. Eve writes that the use of the wand
provides an objective limit on access to magic beyond the existing requirement of magical
personhood, embodying the central ethical theme of her books: “determining people by any kind
of identity, race or supposed inner essence is problematic and difficult” (Eve, 2016, 54).
The symbolic use of codes also surfaces in literary criticism. One approach to
deciphering layers of meaning in texts is that of semiotic literary criticism, which reduces literary
texts to their underlying system of signs. Two authors particularly ripe for this approach are
Vladimir Nabokov and Jorge Luis Borges, both of whom purposefully embed codes and signs
throughout their texts as markers for readers. Nabokov famously does so in his short story “The
Vane Sisters" by embedding an acrostic into the final paragraph of the text that destabilizes the
apparent meaning of the story. Borges uses similar devices in his short stories, as well as deploying
philosophical experiments dealing with codes, as in the short story “The Library of Babel”, which
explores the limits of meaning in coded language by positing a universe constructed from an
expanse of hexagonal rooms filled with books containing every possible ordering of 25 characters.
Within the universe the library must contain every coherent book ever written, buried within an
enormous volume of texts filled with gibberish, driving librarians to a state of despair. This
approach engages with cryptography at the level of semantic meaning, both uncovering new depths
in authors’ use of language and challenging its limits.
The use of codes and ciphers in this intellectual tradition reinforces their importance to the
written language. Cryptography can be an effective means for authors to add layers of meaning or
mystification to texts, or to convey ideas in ways that go beyond the words or actions of characters
on their own. They also encourage readers to exercise their minds in relationship to texts in new
ways, for some leading to a life-long interest in cryptography.
Encryption and Human Rights
Finally, encryption can be used in ways that enable resistance by the powerless in the face
of the powerful. James C. Scott writes in Domination and the Arts of Resistance that powerless
groups often use what he calls ‘hidden transcripts’ to enact critiques in the face of the powerful;
using disguised forms of expression such as rumors, gossip, folk tales, songs, jokes, and gestures
to “insinuate a critique of power while hiding behind anonymity or behind innocuous
understandings of their conduct” (Scott, 1990, xiii). Here, encryption is a subversive force that
balances out asymmetries of power.
For example, slaves fleeing the American South through the Underground Railroad were
assisted by coded messages sewn into quilts, displayed openly by conductors at waypoints on the
trip north. The quilts would indicate safe houses and hiding places, or what kinds of resources
were available to passengers in their travels, and were legible only to those with the ability to read
the codes hidden within them (Rosenberg, 2003). Cryptography was also critical in the early years
of the Revolutionary War, enabling the patriots to identify and communicate with one another
across wide expanses of territory. Codes were used by George Washington to manage the sharing
of information across wide-ranging networks of patriot spies after the outbreak of war, enabling
him to send and receive intelligence across enemy lines. The Committees of Secret
Correspondence, the first foreign affairs arm of the new revolutionary government, used ciphers
to bring foreign powers over to the side of the patriots (Nagy, 2010). The ability to communicate
using pseudonyms was also key to the proliferation of the philosophical ideas expounded by the
patriots. Publishing pamphlets pseudonymously enabled the ideas at the heart of the Revolution to
circulate and gain popularity on their merit without the risk of immediate suppression by
Loyalists.
8
Similar tactics are used in the present day by digital activists. For example, Chinese
netizens have developed elaborate systems of coded internet slang known as e gao that can be used
in public on social media platforms to circumvent censorship by authorities. By reappropriating
common terms and their homophones to distort or subvert their commonplace meaning, everyday
citizens engage in resistance against government oversight. One well-known example is a meme
in which netizens adopted the term “river crab” as a stand in for its homophone “harmonious”, the
signature ideology of then-Chinese president Hu Jintao. As the construction of a “harmonious”
society by Hu Jintao came to be accompanied by ever-stricter levels of censorship, netizens began
8
Examples include the Federalist Papers (written by Alexander Hamilton, James Madison and John Jay
under the pseudonym Publius) and Common Sense (written anonymously by Thomas Paine), as well as
the multiple pseudonyms adopted by Benjamin Franklin in his publication of news articles, including
Silence Dogood, Richard Saunders, Anthony Afterwit, Celia Single. For more on the role of pseudonyms,
see DeSeriis, M. (2015). Improper Names: Collective Pseudonyms from the Luddites to Anonymous,
Minneapolis: University of Minnesota Press, and Calaway, J.C. (2003). Benjamin Franklin’s Female and
Male Pseudonyms: Sex, Gender, Culture, and Name Suppression from Boston to Philadelphia and
Beyond. Honors Projects Paper 18, Illinois Wesleyan University.
62
saying that they were “river-crabbed” in place of “harmonized” to signal to others that their words
had been censored (Nordin & Richaud, 2014). The adoption of codes in this manner enabled
activists to communicate outside the purview of increasingly invasive tactics by the state.
Encryption technologies have also proved useful to journalists and human rights defenders
to protect their sources. The most famous of these cases is that of Edward Snowden, who used encryption tools to protect his communications with the journalist Glenn Greenwald and filmmaker
Laura Poitras while blowing the whistle on mass surveillance by the National Security Agency.
Encryption enabled Snowden to mask his communications from the NSA long enough to escape
to Hong Kong and publish the initial articles from the files he leaked. But the use of encryption by
human rights advocates can also serve as a pretext for repression by the state: for example, the
Zone 9 bloggers, a collective of journalists in Ethiopia who write about political issues and human
rights abuses, were arrested after using encryption tools to protect their correspondence with
sources.
Despite these examples of the use of codes and ciphers as a means of resistance, the association of encryption with privacy and freedom of expression is a relatively recent phenomenon. In Chapter 2, I explore how ideas about cryptography and privacy emerged in the
1960s out of fears about the potential misuse of database computing and timesharing systems.
It was only after the Snowden revelations that the United Nations adopted a resolution on
the right to privacy in the digital age, and this right was soon connected to encryption technologies.
In 2013, then-Special Rapporteur on Free Expression Frank La Rue wrote in a report that “States
must refrain from forcing the private sector to implement measures compromising the privacy,
security and anonymity of communications services, including requiring the construction of
interception capabilities for State surveillance purposes or prohibiting the use of encryption”
(Human Rights Council, 2013). His successor, David Kaye, went on to link encryption to core
values of human rights, arguing it helps to lower barriers to the free flow of information and create
a zone of privacy necessary to make free expression possible (United Nations, 2015). Amnesty
International has taken this a step further, declaring that encryption is itself an ‘enabler’ of human
rights: “Encryption is a basic prerequisite for privacy and free speech in the digital age. Banning
encryption is like banning envelopes and curtains. It takes away a basic tool for keeping your life
private”, said Sherif Elsayed-Ali, Amnesty’s Deputy Director for Global Issues (Amnesty
International, 2016). Increasingly, encryption is being associated with the normative values of
democracy.
One group of cryptographers and privacy advocates known as the cypherpunks realized the
value of encryption for the protection of human rights in the digital age long before these
international institutions. Though Snowden proved a critical tipping point for the widespread acceptance of their ideas, the work of these activists laid the foundations upon which the UN, Amnesty, and
other groups are now building. They saw, where others did not, the dangers of a fully connected
world, and put their hopes in encryption technologies as a means to resist the forces of surveillance.
For decades, they worked to build tools compatible with innovations in networked technologies
that would allow citizens to disconnect, protect their privacy, and communicate anonymously.
Their work was not without flaws: many of the tools built by cypherpunks were difficult
to use, and they spent relatively little time trying to encourage mainstream computer users to adopt
them. But their vision – of a world in which we as individuals retain agency over our identities and
our communications in an otherwise networked society – remains compelling. They envisioned an
internet that put individual sovereignty, not connectivity, at the center, and in so doing sought to
use encryption as a form of resistance against institutional power. How they did so, and why they failed to bring this vision into being, forms the basis of chapters four and five.
Conclusion
As I have explored in depth here, there are several distinct discourses that have emerged
over the course of the history of cryptography, each of which shapes unique interpretations of the
purpose and use of encryption. These different cultural meanings are in part why it has become
the subject of so much controversy: encryption debates turn not only on different ideas about policy, or on what is mathematically possible, but on fundamentally different understandings of the purpose of cryptography. For policymakers attuned to thinking of encryption as a tool for criminals and terrorists, its value as a tool for the protection of privacy may feel trivial. For military and intelligence professionals who see cryptography as a valuable national security resource, it makes sense that it would be regulated in a similar fashion to weaponry. For activists and human rights defenders who rely on cryptography to safely conduct their work, access to cryptography is an enabler of democratic freedoms.
Each of these perspectives is informed by particular configurations of access to information,
and ideas about the role of cryptography in a networked society. As I have outlined, cryptography
can serve as a corrective for some of the harms networked communications infrastructures make possible: namely, that the technologies that connect and empower us can also be used to surveil
and hurt us. By the conclusion of this dissertation I will argue that cryptography can create new
spaces of possibility for communities to form in an environment of mass surveillance; it can enable
those with marginalized identities or marginalized views to create spaces for expression and
cultivate relationships with like-minded individuals.
The ideas we develop around the cultural meaning of cryptography will inevitably surface
in what kinds of encryption technologies are built, adopted, and implemented in infrastructure.
They shape the regulatory policies designed to govern them. Lastly, and most importantly for the
purposes of this project, they emerge in our social imaginaries about the possibilities of our
networked infrastructure.
Chapter 2: New Directions: The Invention of Public Key Encryption
In 1976, two researchers outlined a radically new approach to encryption. In a paper titled
“New Directions in Cryptography”, Whitfield Diffie and Martin Hellman declared that “we stand
today on the brink of a revolution in cryptography” (Diffie & Hellman, 1976). They were,
paradoxically, both a decade too early and a decade too late in their pronouncement. The
‘revolution’ they declared began in earnest a decade before, when cryptographers began to
consider the implications of cryptography for the era of networked communications. But it would
take a decade longer, with advancements in computing power and the spread of timesharing
systems, for it to come to full fruition. The invention of public key encryption was an important
step along the way, and opened up a new field of potential uses for cryptography well-suited to a
changing technological and political context.
The growing adoption of networked computing in the 1960s and 70s created a pressing need for individual privacy and network security. The spread of time-sharing terminals to businesses and universities brought with it entirely new sets of problems: nascent computer users faced challenges like computer-assisted fraud, breaches of computer networks, and the interception and theft of information stored in e-mail programs and computerized databases (Kolata, 1977). At the same time, US citizens were grappling with privacy concerns of another kind, as a series of revelations about the scope and scale of government surveillance unfolded in the press, beginning with COINTELPRO in 1971 and epitomized by Watergate in the years that followed.
It was not clear, at first, that cryptography would have a role to play in addressing either of these concerns: at the start of the 1960s, cryptography was largely monopolized by secretive intelligence agencies engaged in covert battle with the Soviet Union and seen as a critical intelligence resource
on the geopolitical battlefield of the Cold War. But in the ensuing years cryptography underwent
a transition from being understood as primarily about national security to being about data security
and, eventually, individual privacy. This transition changed the study of cryptography as well: the battles
in the 1970s over public key encryption brought the academic discipline of cryptography out of
the shadows and firmly into the light.
Though it would take decades to come to full fruition, the combination of public key
cryptography and networked infrastructure opened up a new realm of possibilities for private
communications by individuals and communities. Cryptography could be used not only to share
secret messages, but to secure and authenticate communications networks, and, eventually, to
enable radically new kinds of social relationships facilitated by technology.
This chapter explores the series of transformations in the social and political meaning of
cryptography in the 1960s and 1970s, spurred on by a growing understanding of the importance
of privacy and security to a digitized society. Employing a social shaping approach that acknowledges technological change and social behavior as mutually constituted (Baym, 2015; Kline & Pinch, 1996; Pinch & Bijker, 1987), I draw on archival texts from two collections centrally involved in cryptographic projects: the records of the IBM Research Lab in Yorktown and the Martin Hellman papers at Stanford University. Analyzing these texts, I consider
how the invention of public key encryption radically changed the social conditions around
cryptography. In so doing, I triangulate my findings from the archive with other texts published in
popular and trade press from the 1960s and 70s, considering how public key cryptography was the
product of a collective realization by computer scientists and the technology industry of the
importance of privacy in the wake of abuses of surveillance powers by government agencies.
Shaped by a changing technological and political environment, public key cryptography is an
artifact of the promises and perils associated with computing in an era of sociopolitical upheaval.
Cryptography and the Cold War
The invention of public key cryptography is sometimes described in cryptographer lore as a spark of inspiration, arrived at, and nearly forgotten, while Whit Diffie walked down the stairs of his home to get a Coke. But the origins of public key encryption stretch much further back into the history of cryptography, and into the dialectic of openness and secrecy that characterized the field across the 20th century.
As in much of the 19th century, at the turn of the 20th century literature on cryptography
was generally easily available to the public. It was a somewhat esoteric subject, but the publication of works like Edgar Allan Poe’s “The Gold-Bug” helped to popularize cryptography among budding mathematicians and enthusiasts seeking pirate treasure and occult secrets
(Rosenheim, 1997). In the 1910s, a privately-run facility, Riverbank Laboratories, was a leading center for codebreaking. Riverbank was run by a wealthy benefactor, Colonel George Fabyan,[9] who by near happenstance employed the two individuals who would become the United States’ foremost cryptanalysts during World Wars I and II, Elizebeth and William Friedman. Placing the Friedmans under the charge of Elizabeth Wells Gallup, Fabyan set them on a failed mission to decrypt messages alleged to have been hidden in the works of William Shakespeare by Sir Francis Bacon.[10] Though this proved to be a wild goose chase, over the first few decades of the 20th century Fabyan amassed a wealth of material on cryptography that made the lab an important resource for the US military at the outset of World War I (Fagone, 2017).

[9] ‘Colonel’ was an honorary title, but one that Fabyan stuck to.

[10] The Friedmans were employed by Fabyan and Elizabeth Wells Gallup to prove that Bacon had used a mixture of typefaces in the original Shakespeare folios to embed messages about his ‘true’ authorship. In their retirement, the Friedmans published a book, The Shakespearean Ciphers Examined, debunking the claims of their former employers and using their own cipher to bury the hidden message “I did not write the plays. F. Bacon”.

During and after the war, the US and
UK governments took steps to establish their own bureaus of cryptanalysts: the codes and ciphers
division MI-8 (eventually known as the Black Chamber), the Signal Intelligence Service, and Bletchley Park. Across these disparate and secretive sites, government officials began to treat cryptographic
research as a closely-guarded national security resource, critical to the war effort in World War II
(Diffie, in Schneier, 1996).
This view of cryptography carried through to the Cold War. In 1952, Harry Truman signed
a classified order consolidating the various offices and departments working across different
military intelligence organizations into a new National Security Agency, which promptly set about
hiring the best and brightest mathematical minds it could find to do cryptographic research. In
a turn that seems uncharacteristic of an agency whose very existence was classified at the time,
the NSA formed a National Security Agency Scientific Advisory Board, which convened a group
of scientific experts who led a $25 million program called Project Lightning to fund public research
into increasing the speed of computers. Though short-lived, the five-year project resulted in a very
productive period of collaboration between the academic and intelligence communities, leading to
the publication of 160 technical articles, 320 patent applications, and 71 university theses
(Bamford, 1982).
But this period of openness took a different direction following the findings of the Baker
Committee, which convened the nation’s top scientists in July 1957 under the leadership of Bell
Labs’ research chief Dr. William O. Baker to review the nation’s cryptologic capabilities. The
Committee determined cryptography would be one of the most important weapons in the Cold
War, and recommended the formation of an effort much like the Manhattan Project devoted to
improving the country’s cryptographic capabilities. To do so, the Agency redirected its funds toward Project Focus, which would similarly aim to fund research, this time in secret, into advanced cryptography under the umbrella of the Institute for Defense Analysis, housed at
Princeton (Bamford, 1982). Project Focus instituted classification requirements on cryptographic
research that meant little innovative work could reach the public eye.
With the short-lived exception of Project Lightning, cryptography thus largely fell off the
radar of public research between the late 1930s and early 1960s. This meant that many young
researchers were discouraged from working in the field by the likelihood that any of their published
work would have to undergo classification (Hellman, in Yost, 2004). Moreover, the Agency used
a loophole in the patent system to prevent work on cryptography from being commercialized by
issuing secrecy orders for patents deemed to have implications for national security (Diffie &
Landau, 2007).[11] What little work was conducted outside of Project Focus quickly fell behind the state of the art (Diffie, in Schneier, 1996).
This period of relative quiet was punctured in 1967 with the publication of David Kahn’s
masterwork The Codebreakers. Kahn’s book is a rich chronicle not only of the long history of
cryptography, but its crucial role in World Wars I and II and the formation of the National Security
Agency. Because it covered these topics, which were highly sensitive at the time, the Agency
sought to prevent its publication, petitioning the publisher, Macmillan, for prepublication review without the author’s permission. The Agency was eventually successful in lobbying for the deletion of several paragraphs, but not in scuttling the book’s publication altogether (Bamford, 1982).[12]

[11] These orders are still in use; the Federation of American Scientists counts 5,680 secrecy orders in effect as of the close of FY2016 (Aftergood, 2016).

[12] Bamford’s book The Puzzle Palace publishes the deleted paragraphs in full, which explain the relationship between the US National Security Agency and the UK’s Government Communications Headquarters. See Bamford (1982), pp. 129-130.
Clocking in at nearly 1200 pages, The Codebreakers is widely acknowledged to be the
definitive book on the history of cryptography. Kahn became something of a public intellectual in
the years following, writing articles in Playboy Magazine about codebreaking and providing
commentary about crypto policy debates in the New York Times. He also helped to found a new
journal, Cryptologia, devoted to discussions about all things cryptographic. Starting with the
publication of his book and encouraged by his additional body of work, Kahn played an important
role in reshaping the discursive space around cryptography, reigniting it as a subject of popular
interest.
IBM and the Cryptography Research Group
The publication of The Codebreakers coincided with another important move toward bringing cryptography back into the public domain: the establishment of a cryptography research group by IBM Chairman Thomas J. Watson, Jr. at the company’s research center in Yorktown Heights, NY. As the adoption of mainframe computers and time-sharing systems by commercial firms grew, IBM recognized a need to develop cryptographic techniques for private sector use.[13] Companies were shifting to time-sharing terminals, which multiplied both the number of computer users and the points of vulnerability in the security of networked computing systems. Prior to this, computer security had primarily guarded against threats from within, through locks on the doors to computer rooms and security guards. With the introduction of time-sharing, however, it became necessary to consider threats from outside (Yost, 2015).

[13] It is likely that IBM was at least as motivated to develop encryption for government clients: as late as 1974, Lewis Branscomb, then Vice President and Chief Scientist at IBM, said that “market pressure for data encryption in communications systems was small and would probably grow very slowly, suggesting that a ‘technology race’ for ever more sophisticated encryption systems in the commercial marketplace would be very unlikely for some time to come” (Branscomb, personal communication, 1974). An alternative interpretation, also outlined in the Branscomb memo, is that IBM was motivated by the anticipation that courts would hold computer users responsible for the protection of privacy.
Decades before the heyday of computer hacking, businesses began to express concerns
about the safety of their computer networks. As TIME Magazine explained, “In today’s world, the
integrity of secret messages can be crucial not only to national security but to commercial and
industrial operations as well. Yet as society becomes increasingly reliant on electronically relayed
communications – and more sophisticated new gadgetry is developed to intercept them – it is
becoming harder than ever to keep a transmitted secret” (TIME, 1978, 5). These fears led to the
emergence of computer security as a field of study in the early 1970s (Bishop, 1998).
IBM was focused on the use of cryptography in an area where security was urgently needed: electronic funds transfers. It was contracted by Lloyd’s Bank to develop a system of 600 networked cash-issuing terminals, an early version of the Automated Teller Machine, in London. The first such terminal had been built by competitor Barclays Bank in 1967, and worked by collecting ten-pound vouchers, issued to approved customers, that carried an “unbreakable punch code”. A customer would sign the voucher and place it in the terminal’s drawer to be tested against a code the customer would enter (Murray, 2017). IBM, which already operated a computerized clearing system in Surrey for London’s banks, wanted to create a new version of the terminal system that replaced paper vouchers with an online real-time system (Batiz-Lazo et al., 2014).
But Lloyd’s executives expressed some reservations about IBM’s proposal: could the lines
transmitting signals between the network of cash-dispensing machines and the centralized
computer monitoring and checking transactions be wiretapped by eavesdroppers? IBM believed
cryptography would present a solution to this problem, and set out to develop a method that would
ensure that the machines could safely transmit requests for cash to and from banking institutions.
As IBM Vice President and Chief Scientist Lewis Branscomb described it, “The goal in
cryptography is to render information undecipherable. We don’t need a perfect encryption scheme.
What we do need is an encryption so difficult to decipher that the very small chance of success
isn’t worth the effort” (Bode, 1978).
Horst Feistel, a German-born mathematician who had worked for the US Air Force and the MITRE Corporation, headed a new research group charged with developing the system. Feistel had long held an interest in cryptography, but at the military research centers where he had worked he faced challenges due to political sensitivities around his German origins. When he joined IBM in 1968, he was finally given room to explore his interests fully, working with a team to invent a new encryption method, named “Lucifer”. IBM filed for patents for the new commercial application,[14] but was temporarily blocked by an NSA-issued secrecy order (Konheim, 2015).

[14] This patent is filed under #3798359A.

Once the order was lifted, Walter Tuchman, who managed data security products at IBM’s System Communications Division in Kingston, took up work on the algorithm, realizing it needed considerable strengthening before it could withstand massive commercial use. After years of testing implementations in software and hardware, Lucifer was finally ready for the wider market. IBM launched three commercialized cryptographic products in 1977: the IBM 3845 and 3846, which were table-top and rack-mounted data encryption units compatible with all computer terminals, and the IBM Cryptographic Subsystem, which was designed for use with IBM computers and data processing networks (Morris, 1977; Bode, 1978). In developing Lucifer, IBM acknowledged the importance of cryptographic systems to its customers. By 1973, IBM’s Science Advisory Committee concluded there was a need for a single cryptographic architecture, technology, and product strategy for the company, and decided to set up a Crypto Center (Branscomb, personal communication, 1973; Kanter, personal communication, 1973).
The company’s shift in strategic focus conveniently coincided with a series of changes in
government regulation of cryptography. The National Bureau of Standards was charged by
Congress in 1965 with developing security standards for the federal government’s use of
computers, which led it to commission a series of studies in 1968 on the government’s need for
computer security (Bamford, 1982). In early 1972, it held discussions with NSA, and received
guidance that encryption should be incorporated in civilian applications.[15] In a Memorandum of Understanding between NSA and NBS dated September 18, 1972, the two organizations wrote that
“the provision of adequate security measures for the protection of data and control of data access
in data processing systems is vital to the national interest. The federal government…has a basic
responsibility to assure the development and application of appropriate computer security
measures”. Initially, the memo suggests, this was envisioned as a plan to transfer “techniques for
the protection of data and the control of data access which have been developed by NSA” (National
Security Agency, 1972).
[15] In a 1974 memo, Branscomb noted a primary concern was that terminal manufacturers would bring to the marketplace “encryption algorithms of dubious strength”, thus motivating government intervention (Branscomb, personal communication, 1974).

By 1972, NBS concluded that a technical solution should be developed, and a year later issued a call for an encryption method that would be published as a public standard and used to protect the storage and transmission of data government-wide (US Congress, 1987). In a status report on its progress in acquiring an encryption algorithm for data protection, the agency wrote that it believed, “based on expert guidance, that data encryption is the only acceptable means for protecting data during transmission between a computer and its terminals or other computers”.[16]
IBM won the bid with a variant of Lucifer termed the Alternative Encryption Technique (AET), later called the Data Encryption Standard (DES) (Mollin, 2006). The algorithm came under scrutiny soon after the announcement of the bid, because the proposed standard reduced the size of AET’s encryption key from 64 bits to 56 bits. Stanford Associate Professor Martin Hellman and his graduate student, Whitfield Diffie, were among the most vigorous objectors to its adoption. They outlined a series of concerns about the proposed standard, arguing that the reduced key size made it easier than necessary to conduct a “brute force attack”, in which a computer tries all possible keys until it arrives at the correct one. Through a series of calculations, they found that an actor with a $20 million system could search the keyspace in slightly less than one day. Though this would be economically infeasible for commercial entities, it would leave any systems protected with the standard vulnerable to a nation-state actor. Moreover, they argued, using all 64 key bits would exponentially increase the system’s security, raising the cost of a one-day search to over $5 billion (Hellman, personal communication, 1975).
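The arithmetic behind this objection is straightforward to reproduce. The short Python sketch below is my own illustration, not drawn from the archive; it assumes, as Hellman and Diffie’s estimate did, that the cost of a fixed-time exhaustive key search scales linearly with the size of the keyspace, and it uses the dollar figures cited above.

    # Brute-force cost arithmetic for the DES key size debate.
    # Assumption: machine cost scales linearly with keyspace size for a
    # fixed search time; $20 million is Hellman and Diffie's estimate for
    # a one-day search of a 56-bit keyspace.

    def keyspace(bits: int) -> int:
        """Number of possible keys for a given key length."""
        return 2 ** bits

    cost_56_bit = 20_000_000  # dollars

    # Each additional key bit doubles the search space, and hence the
    # machine cost needed to hold the search time constant.
    scale = keyspace(64) // keyspace(56)    # 2^8 = 256
    cost_64_bit = cost_56_bit * scale       # $5.12 billion

    print(f"56-bit keyspace: {keyspace(56):,} keys")
    print(f"64-bit keyspace: {keyspace(64):,} keys")
    print(f"One-day search cost: ${cost_56_bit:,} -> ${cost_64_bit:,}")

The 256-fold jump from 56 to 64 bits is exactly the ratio between the $20 million and roughly $5 billion figures cited in Hellman’s correspondence.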
This method of evaluation comes up time and again in the context of cryptography policy: how much computing power would it hypothetically take to break an encryption algorithm? What is the cost of this power? Who could afford it, and what motivations would make the effort worthwhile? Assessing the strength of an encryption system is as much an economic calculation as a technical one, and it shifts alongside changes in computing technology. This makes knowledge of the computing landscape (in the present day, ownership of server farms and advancements in quantum computing) critical intelligence in assessing the strength of encryption software.

[16] I accessed a copy of this report in the Harwood Kolsky papers at the Computer History Museum. An unknown author, who I suspect to be Kolsky, who then served on IBM’s Corporate Technical Committee, underlined the words “based on expert guidance” and added “from IBM” to them, reinforcing the possible existence of a backchannel between IBM and NBS regarding the solicitation of an encryption algorithm.
Based on his calculations, Hellman felt it was likely there were other motives at play in the reduction of the key size. In a letter to David Kahn, he confided his belief that NSA, which NBS consulted in its choice of a standard, was responsible for the selection of a less-than-optimal standard, prioritizing its need to be able to solve intercepted communications from other nations over the security of US commercial systems (Hellman, personal communication, January 16, 1976).[17] His reasoning was motivated by concerns about state security: he worried that the weakening of a commercial standard would open up the communications of US businesses to spying by other nations. Since the US had far more computing systems than the Soviet Union, Hellman later recalled, he and his colleagues feared it would have the most to lose from insecure encryption; thus, they felt it was in the country’s national security interest to strengthen the standard (Hellman, 2004).
Though there is no definitive proof, the notion that there was some level of intervention by
NSA in limiting the strength of DES is persistent among cryptographers. The IBM documents I
analyzed, many of which were given “classified” status within the company, at a minimum confirm
the existence of longstanding backchannel communications between the company and the NSA during the standard’s development.
enterprises protecting their communications abroad, these interests may not have overcome
intelligence agencies’ demands to limit widespread adoption of encryption or the promulgation of
technologies that made this possible.
[17] A memo by IBM researcher Alan G. Konheim following a visit by the IBM team to the National Bureau of Standards suggests that the NSA was at least strongly supportive of the IBM implementation. He wrote in the memo that “Various cryptic remarks, winks, and smiles by Messrs. Rosenblum and Levinson [representatives from the NSA at the meeting] were construed by us (and interpreted by a highly placed source in Research) to be positive reactions to our cryptographic program” (Konheim, personal communication, May 30, 1973).
Despite a concerted campaign by Diffie, Kahn and Hellman that rallied such institutions as
Alcoa, Bell Telephone Labs, Sperry Univac and MIT in opposition to the standard, NBS adopted
DES in 1976 (Hellman, personal communication, 1976). Commercial adoption of cryptography spread rapidly in the years following: the Wall Street Journal wrote of hundreds of large firms that were encrypting their data. Exxon, Shell, and US Steel used cryptography to protect computerized personnel files. Ford encrypted administrative memos sent between its headquarters and global plants, to prevent their interception. Oil companies began using ciphers to protect the geological and drilling information stored in their computers. Construction companies used cryptography to encode bids sent to countries where their competitors were government-sponsored, and others encrypted messages about executive travel to nations known for terrorism.
The Department of Agriculture even used encryption to secure the information used to make
monthly forecasts of US crop production (Shaffer, 1978).
The leaders in adopting encryption technologies, however, were banks and financial
institutions. They were uniquely motivated; as Richard Shaffer put it in the article, “computers
have largely replaced checks and letters as the means for moving large amounts of money. The
machines are connected in globe-spanning webs of telephone lines, and tapping the lines could
enable someone to steal huge sums”. Shaffer cited the New York-based Greenwich Savings Bank,
an early innovator in the use of debit cards, as one exemplar: the bank followed the lead of IBM’s client Lloyd’s and encrypted the passwords its customers punched into its new debit card terminals. The SWIFT computer network also encrypted the messages to and from the 500
international banks it linked, ensuring intra-bank communications were secure. Citibank also
began to encode all traffic on some of its private wires, such as those linking New York City and
London (Shaffer, 1978).
Notably lagging were credit monitoring companies: Shaffer cites Equifax and TRW Credit
Data as among those who felt their files were safe enough without encrypting, and even expressed
fear that governments would force them to impose the technique. But they were exceptions to the rule: by the end of the 1970s, companies commonly used cryptography in their networked communications. The commercial market for products implementing the DES standard
expanded beyond IBM to incorporate other hardware and software products developed by
Fairchild and Motorola.
Still, though the adoption of cryptography grew in the early 1970s, these firms used it primarily as a form of data security: keeping their communications and transactions confidential, and preventing manipulation of databases by the emerging computer fraud industry. Though this marked a significant expansion of the use of cryptography beyond the realm of the state, its underlying meaning was unchanged: cryptography was designed for the secrecy and security of information. This would soon shift, the product of changing technological and political conditions.
Cryptography, Privacy and Authentication
The history of Lucifer and DES is often either celebrated as an important step toward the widespread adoption of encryption, or critiqued because of the NSA’s attempts to weaken the standards that would be used by the public sector. But the discursive relationship between
cryptography and individual privacy – not just computer system security or nation-state secret
keeping – was also strengthened as a result of the debate over the standard.
IBM Chairman Thomas J. Watson Jr. was an early proponent of the view that computer
professionals had a responsibility to protect users’ privacy. In a 1968 speech before the
Commonwealth Club of California, Watson said that “we in industry must offer to share every bit
of specialized knowledge we have with the users of our machines – the men who set their purposes
– in a determination to help secure progress and privacy” (IBM, 1970). This notion of privacy as
a social issue of broad importance also shaped how IBM conceived of data security. In a brochure
titled “The Considerations of Data Security in a Computer Environment” the company guided
systems designers to “proceed with a recognition that the protection of individual privacy places
special responsibilities on those who determine how systems are to be used” (IBM, 1970).[18] IBM saw privacy and data security as related, but ultimately distinct, domains of concern, and in 1972 launched a five-year effort to study and develop implementations of data security products.
Feistel took a different view. He wrote about Lucifer in 1973 in an article in Scientific
American, a relatively rare public discussion of cryptography at the time. But rather than
discussing Lucifer in the context of IBM’s business interests in securing computing systems, Feistel reframed these concerns, outlining an urgent need to protect individual privacy from the threats posed by computer systems, and particularly databases containing personal
information.
“Since many computers contain personal data and are accessible from distant terminals,
they are viewed as an unexcelled means of assembling large amounts of information about an
individual or group,” he wrote. He described the “dangerous ease” with which dossiers could be compiled on citizens, noting that computers made it easier than ever before to gather information from disparate locations and monitor citizens (Feistel, 1973).

[18] In relaying the findings of this chapter to a member of the digital rights community toward the end of my fieldwork, I was reminded that IBM was also involved at this time in providing computing technologies to the South African government that were used in the administration of racialized identity classifications under apartheid. The company’s activities thus did not necessarily follow from these assertions by officials like Watson, nor did conceptualizations of the ‘user’ of their machines necessarily include all citizens. For more on IBM’s activities in South Africa see: Human Rights@Harvard Law (n.d.), In re South African Apartheid Litigation. Retrieved Apr. 9, 2018 from http://hrp.law.harvard.edu/areas-of-focus/previous-areas-of-focus/in-re-south-african-apartheid-litigation/. Geoffrey Bowker and Susan Leigh Star also discuss race classification under apartheid in chapter 6 of their 1999 book Sorting Things Out, Cambridge: MIT Press.
Feistel’s suspicions of government may have been a product of his experiences as a German émigré.
Born in Berlin, he left Germany during the years of Hitler’s rise and immigrated to the United
States in 1934. He was placed under house arrest in his home in Cambridge at the outset of the
war, but was granted citizenship and a security clearance to work for the Air Force on secretive
research projects in the years following. Though he worked for the US military for decades, he
would have been familiar with the abuses of power that can result from the excesses of surveillance.
His concerns were also reflective of the times: though Feistel does not comment directly
on them, his article followed a series of scandals about government surveillance, such as the 1971
exposure of the Federal Bureau of Investigation’s COINTELPRO program by the Citizens’
Commission to Investigate the FBI, which showed that the agency had compiled thousands of
dossiers on members of the Civil Rights Movement and New Left organizations. The Watergate
investigations into wiretapping in the White House were also well underway by 1973, with the
Senate’s televised hearings into the scandal starting only two weeks after the publication of
Feistel’s article.
Paul Armer, then a fellow at the Center for Advanced Study in the Behavioral Sciences
(and formerly the director of the Stanford Computing Center) wrote an article in the magazine
Computers and People describing the state of the art in computing technology as “a most important
sub-set of surveillance technology”. Armer warned that the advances in computing technology of
the time – improvements in computing power, introduction of microprocessors in everyday devices
like trucks and appliances, and networking systems like the ARPANET – would have profound
effects on individual privacy. He was particularly concerned that the very kinds of electronic funds
transfer systems Feistel was at work building were primed to become “the best surveillance system
we could imagine with the constraint that it not be obtrusive” (Armer, 1975).[19]

[19] Thanks to Finn Brunton for pointing me to this article; his forthcoming book Digital Cash: A Cultural History delves into this anecdote in more detail.
Feistel believed cryptography could present a potential solution to these incursions on
privacy. It would limit abuse of such systems by ensuring that only those meant to have access could gain access to the information, making it difficult for intruders to take advantage of
computerized databases for nefarious purposes. Moreover, cryptography would make it possible
to protect against attempts to alter the information in a system, thus hindering computer-assisted
fraud. Like Armer, Feistel was concerned with the potential for deception through database
systems: as he described in a classified report to IBM, “machine communications systems, in
contrast to systems which can enlist the subtle filtering capabilities of the human brain are very
sensitive to interference and deception. Without special protection computers are easily fooled and
this can become an intolerable burden to a data bank operation if this remains unnoticed” (Feistel,
1970). He concluded that cryptography could be used not only to ensure confidentiality in
communication, but to provide privacy and authentication for communities of databank users.
These new ideas of cryptography as a form of proof and a means of privacy were reinforced by other cryptography enthusiasts engaged in the debate over DES. David Kahn
noted in an op-ed in the New York Times, “Like people, computers talking to one another can be
wiretapped…this has led to demand for a common cipher – a system that would both permit
intercommunication among computers and safeguard the privacy of data transmissions”. He went
on to consider the broader effects of computer networking for surveillance:
Why should the National Security Agency be so passionately interested in the 56-bit key
that it asked to attend a meeting that Hellman set up on the question and flew a man across
the country for it? The N.S.A. expert declined to say. But one obvious reason is that, with
a solvable cipher, N.S.A. would be able to read the increasing volumes of data that are
flowing into the United States time-sharing and other computer networks from abroad.
(Kahn, 1976)
In a letter to US Secretary of Commerce Elliot Richardson, Hellman agreed that DES “may pose
a threat to individual privacy”, as it made it possible for NSA to “misuse its ability to delve, almost
at will and undetected, into the supposedly private files of other agencies” (Hellman, personal
communication, 1976).
These fears proved to be warranted: in 1977, a Congressional investigation meant to inform an update to the country’s wiretap law revealed that the National Security Agency had indeed developed substantial dragnet eavesdropping capabilities. For years, the investigation found, the NSA had been collecting telegram traffic in bulk, storing it in computers, and using lists of key words to search and retrieve messages deemed suspicious: the kinds of capabilities Feistel had warned were possible with “dangerous ease” in new database computing systems. The NSA was also caught illegally monitoring the phone conversations of US citizens (Volkman, 1977). An article in Science noted that the NSA’s capacities were aided by three technologies: the capability to send multiple telegraph messages on a single stream and sort them at the receiving end, the growth in computer storage capacity, and the ability to retrieve selected information from databases with precision (Shapely, 1977).
As Feistel, Kahn, and Hellman made clear, by the early 1970s there was an urgent need for
protections against the kinds of incursions on privacy made possible by database computing
systems. Though DES was largely described as a mechanism for data security, Feistel’s article lays out an alternative vision: cryptography could also serve to protect individuals’ privacy and to guard against the risk that the information in databases could be tampered with (Yasaki, 1976). The
invention of Lucifer proved to be an important, but ultimately only partial, step toward addressing
the problems he outlined.
New Directions
The invention of public key cryptography would soon solidify the relationship between
cryptography and privacy, dramatically changing the social conditions around data encryption.
Like Feistel, Diffie and Hellman recognized a commercial need for encryption outside of the
auspices of military intelligence. As Hellman later described it, “The fact that IBM was spending
a huge amount of money on cryptography told me there were commercial applications for it”
(Hellman, in Yost, 2004). But though they were in dialogue with IBM cryptographers, the company discouraged additional research on cryptography after it had finalized its work on Lucifer; it was more interested in research on operating system security.
Moreover, the well-resourced National Security Agency was still generally regarded as the hub of innovation in cryptography. Hellman later recounted that though there were a few
crypto enthusiasts who had emerged in the years following the publication of David Kahn’s
seminal cryptographic history The Codebreakers, cryptography was not particularly popular
among academic researchers at the time, in part due to fears that new discoveries would
immediately be classified and kept from public view (Hellman, in Yost, 2004).
Against these odds, Hellman and Diffie worked together on what they hoped at the time
would be the foundation for an entirely new theory of cryptography. The product of their work,
“New Directions in Cryptography”, is a reflection of their goal to revolutionize the field, outlining
how both technological and economic change opened up new possibilities for incorporating
cryptographic devices in commercial applications. “We stand today on the brink of a revolution in
cryptography”, they announced, arguing that cryptography would be transformed from an “ancient
art into a science” (Diffie & Hellman, 1976). Despite their lofty rhetoric, as Hellman recounted in
an interview decades later, the authors were dismayed that the work they presented in “New
Directions” was more modest than they’d hoped, a perspective that seems almost incongruous
given the body of work the paper inspired (Hellman, in Yost, 2004).
Public Key Architecture
The design of public key cryptography is deserving of deeper analysis, because it embodies a shift away from a one-way, transmission model[20] and toward a two-way, communicative model. This change in architecture had broader social and organizational implications. Over most of its history, cryptography has required carefully guarded procedures for controlling the distribution of decryption keys so that they are not lost or uncovered by third parties; if a key is exposed, third parties can decode the secret message. At the same time, in order to make cryptography functionally useful, there must be a channel established to share decryption keys with the party you wish to communicate with, and to update them if they ever change. This means that the parties who wish to communicate in secret must first establish some additional secret channel through which to share their method. This is commonly known as the key sharing problem.

[20] This is sometimes referred to as symmetric key cryptography, denoting that the same key is used to encrypt and decrypt the message, while public key cryptography is known as asymmetric key cryptography. I opt not to use these terms here as they are somewhat confusing for the argument I am making: the object of my analysis is the relationship between communicating parties, rather than the keys.
Figure 2.1: Symmetric key crypto system: the same key is used to encrypt and decrypt the message.
Source: Nakamoto Institute, 2013
A number of different solutions to the key sharing problem were developed prior to Diffie
and Hellman’s paper: one highly secure method, if used perfectly, is to use one-time pads, a deck
of encryption keys that are each used and disposed of after a message is sent. But using one-time
pads effectively presents organizational challenges: over time, there have been many examples
where the transmitters of messages re-used keys for the sake of convenience, not knowing that a
third-party adversary had cracked the code. Solutions like this seemed to work best in organizational structures that are hierarchical, enable oversight and control of key management, and operate with a high degree of trust between parties.
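To make these mechanics concrete, here is a minimal one-time pad sketch in Python (illustrative code of my own, not drawn from any source in this chapter). It shows both how the cipher works and why key reuse, the operational failure described above, is fatal.

    import secrets

    # One-time pad: XOR the message with a random pad of equal length.
    # The pad must be truly random, as long as the message, and used once.

    def otp(message: bytes, pad: bytes) -> bytes:
        """Encrypt (or, applied again, decrypt) by XORing with the pad."""
        assert len(pad) >= len(message), "pad must cover the whole message"
        return bytes(m ^ k for m, k in zip(message, pad))

    message = b"ATTACK AT DAWN"
    pad = secrets.token_bytes(len(message))   # used once, then discarded

    ciphertext = otp(message, pad)
    assert otp(ciphertext, pad) == message    # the same operation decrypts

    # Reusing the pad is fatal: XORing two ciphertexts made with the same
    # pad cancels the key entirely, leaking the XOR of the two plaintexts.
    second = b"RETREAT AT SIX"
    leaked = bytes(a ^ b for a, b in zip(otp(second, pad), ciphertext))
    assert leaked == bytes(m ^ s for m, s in zip(message, second))

Everything therefore hinges on generating, distributing, and destroying pads correctly, an organizational problem as much as a mathematical one.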
This is, in part, why cryptography has historically proven such a good fit for organizations
like military and intelligence agencies: their organizational structures are well-suited to the
management and distribution of keys. But globalization introduced strains to these systems: they
required channels for communicating keys to reach farther and incorporate more people into the
system to work effectively, increasing the risk of error. Moreover, they limited the practical utility
of networking technologies.
Public key cryptography proposed a new method, drawing on a basic mathematical principle, the one-way function, to enable keys to be shared in public. Rather than distribute keys through extensive secret channels, public key cryptography splits the key into a public and a private version. The sender obtains the public part of the receiver’s key and uses it to encrypt their message, transforming its contents through one-way mathematical functions that are difficult to invert, so that only the receiver’s private key can be used to decrypt it.
Figure 2.2: Public key crypto system: a combination of public and private keys is used to perform and solve a one-way mathematical function to decrypt the message. Source: Nakamoto Institute, 2013.
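The key exchange Diffie and Hellman proposed can be sketched in a few lines of Python. This toy version is my own illustration, with parameters far too small for real use; it uses modular exponentiation as the one-way function, so that the public values can be shared openly while recovering the private exponents requires solving the discrete logarithm problem.

    import secrets

    # Toy Diffie-Hellman exchange. The prime is illustrative only; real
    # deployments use vetted groups with much larger moduli.
    p = 0xFFFFFFFFFFFFFFC5   # a 64-bit prime (2**64 - 59)
    g = 5                    # public generator

    # Each party keeps a random private exponent and publishes g^x mod p.
    a_secret = secrets.randbelow(p - 2) + 1
    b_secret = secrets.randbelow(p - 2) + 1
    a_public = pow(g, a_secret, p)    # safe to send in the clear
    b_public = pow(g, b_secret, p)

    # Both sides arrive at g^(a*b) mod p; an eavesdropper sees only the
    # public values and faces the discrete logarithm problem.
    shared_a = pow(b_public, a_secret, p)
    shared_b = pow(a_public, b_secret, p)
    assert shared_a == shared_b       # the same secret, agreed in public

The shared secret can then key a conventional cipher; at no point did the parties need a prior secret channel.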
If one-way systems lend themselves well to hierarchical organizational structures, the introduction of public key cryptographic systems opened up new kinds of social relationships that could be
fostered through cryptography. As one writer in Science put it, “This proposal may be arriving just
in time to overcome the massive logistical problems that exchanging codes will pose if
computerization of communications continues as expected” (Kolata, 1977, 747). Initiating
encrypted communications was no longer a one-way affair that required cipher systems to be
shared clandestinely; in fact, it no longer required that the communicating parties even know one another in order to encrypt their communications. Private conversations could be conducted in
public view, on the basis of mutual trust in the cryptographic system. This materially challenged
the idea of cryptography as being solely and fundamentally about secrecy; it could instead be about
privacy, security, and communication.
Public Key Infrastructure
These transformations in the social relations around cryptography were not the product of technical architectures on their own; they were bolstered by sociotechnical systems known as public key infrastructures, which emerged in the years following the publication of Diffie and
Hellman’s paper. Public key infrastructures are systems through which public keys and certificates
can be shared and managed. If you want to send an encrypted message to an individual, but they
haven’t shared their public key with you, they may have uploaded their key to a database where
you can find it. If you’re not sure whether to trust the key in the database, you might check whether
anyone you know has signed, or verified, that the key matches the individual claiming it. At the
heart of public key infrastructure is thus another important principle of public key cryptography:
keys can be used not only to encrypt messages, but to authenticate the identity of the sender through sociotechnical systems designed to build trust. In addition to enciphering the material, checking a
message against the sender’s public key allows the recipient to verify that the sender is who they
say they are.
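The authentication half of this principle can be illustrated with textbook numbers. The Python sketch below is a toy of my own, omitting the hashing and padding that any real system requires; it signs a value with a private exponent and verifies it with the matching public key, the basic check a recipient performs.

    # Toy signature check using classic textbook RSA parameters.
    # Requires Python 3.8+ for the modular inverse via pow(e, -1, m).
    p, q = 61, 53                        # secret primes
    n = p * q                            # public modulus (3233)
    e = 17                               # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (2753)

    digest = 42   # stand-in for a hash of the actual message

    signature = pow(digest, d, n)     # only the private key holder can produce this
    recovered = pow(signature, e, n)  # anyone with the public key can check it
    assert recovered == digest        # verified: the sender holds the private key

Encryption runs the same mathematics in the opposite direction, applying the public exponent first and the private one to decrypt.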
There are a variety of approaches to public key infrastructure that have developed for
different purposes. In some systems, institutional third parties take responsibility for management
and verification, such as certificate authorities that verify the key of the web servers sending web
page information to your browser. In other systems, networks of relationships enable keys to be
shared and verified, such as the Web of Trust infrastructure used in the e-mail encryption system
PGP. Though there are now other systems for key verification, since its earliest days some PGP
users verified one another’s keys at key signing parties, which serve not only to reinforce the trust
paths in the network of PGP users, but are a way to socialize with and get to know other crypto
enthusiasts. This played an important role in establishing a sense of a crypto community among
users, and expanded trust within the community by ensuring that even if you haven’t personally
verified the public key of the person you are sending an email to, you may know someone you
trust who has.
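The trust-path check at the heart of the Web of Trust reduces to a simple lookup, sketched below in Python. The addresses, fingerprint, and keyserver structure here are invented for illustration: before trusting a key fetched from a keyserver, check whether anyone whose key you have already verified has signed it.

    # Hypothetical web-of-trust lookup: trust a fetched key only if its
    # signers intersect the set of keys we have already verified ourselves.
    keyserver = {
        "carol@example.org": {
            "fingerprint": "9F3B 22A1 77E0 C4D8",   # invented value
            "signed_by": {"alice@example.org", "dave@example.org"},
        },
    }

    my_verified_keys = {"alice@example.org", "bob@example.org"}

    def is_vouched_for(address: str) -> bool:
        """True if someone we already trust has signed this key."""
        record = keyserver.get(address)
        return bool(record and record["signed_by"] & my_verified_keys)

    print(is_vouched_for("carol@example.org"))   # True: Alice signed Carol's key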
Both of these examples illustrate how public key encryption turns the foundational
principle of cryptography on its head: where cryptographic systems had historically been built on principles of asymmetry and distrust (protecting the integrity of a code system against adversaries seeking to break it), public key cryptographic systems are based on principles of trust,
mutuality, and transparency. Rather than engineer the need for trust out of cryptographic systems
entirely, public key infrastructure is a sociotechnical approach that demands users engage with and
vouch for one another. As Jean-Francois Blanchette describes it, “Cryptographic tools and
knowledge would thus move from a dysfunctional institutional context, dominated by the needs
of states for self-protection, to one regulated by the scientific ethos of openness” (Blanchette, 2012,
40).
In a sense, this bears similarities to another system that would be familiar to Diffie and
Hellman: academic peer review. Though the author does not know the identity of those reviewing
the work, the editor assigning it ‘vouches’ for their expertise and the value of their feedback. And,
just as academic peer review contributes to the sense of taking part in an intellectual community,
public key infrastructure played an important role in building a sense of there being a ‘crypto’
community. For example, by making visible the trust paths between users who have signed one
another’s keys, PGP users gain a sense that they are part of an expanding network of adoptees and
a community of like-minded individuals who share similar interests and values.
Public Key Cryptography and Academic Freedom
Ironically, the academic publishing system held back the publication of papers outlining
public key cryptography in its earliest years. Hellman’s student, Ralph Merkle, developed an early
version of public key distribution in a paper, but was foiled by the long publishing calendar of the
journal he submitted his work to. Though the Merkle and Diffie-Hellman papers were submitted
at the same time, the publication Diffie and Hellman submitted to decided to fast-track the
evaluation of their manuscript, while Merkle’s took two years to appear. Hellman later described this as a “quirk of fate” that meant Merkle’s contributions were overlooked.
Other members of the academic community also played important roles in public key
cryptography’s invention. Hellman cites John Gill, an associate professor at Stanford and one of
the first black graduates of Georgia Tech, as the person who came up with the idea of using
factoring or modular exponentiation/discrete logarithms for the one-way function in the paper.
Ron Rivest, Adi Shamir, and Leonard Adleman, all at MIT, also played a crucial role by developing a practicable implementation of Diffie and Hellman’s system. They were quick to patent their
ideas, and eventually built a successful business, RSA, off of the scheme, demonstrating its
commercial viability.
The nature of the invention of public key cryptography as a concept that emerged out of an
academic community destabilizes the common notion that it was a ‘great man’s invention’ or
embodied a singular big-bang moment for the field. As Jean-Francois Blanchette writes, “What
perhaps secured ‘New Directions’ such a lofty place in the pantheon of cryptography was its
authors’ ability to situate their discoveries in the context of a broad historical progression”
(Blanchette, 2012, 40). Both the history and architecture of public key cryptography foreground
the value of collaboration. Though Diffie and Hellman are rightly lauded for their invention,
Hellman also emphasizes that public key cryptography is, at least in part, the product of a community of researchers working on a widely-known problem.
Crypto Wars 0.1
Hellman reflected years later on the invention of public key encryption:
…We were almost being channeled in that direction, and what I say when I give this talk
on the evolution of public key cryptography, I say, ‘Well, initially your reaction may be
how did they come up with something so earth shattering, so ground breaking, so different
from what was before?’ But after I describe all the threads that were leading us there
consciously or subconsciously, I hope your reaction will be, ‘Why did it take them so long?’
(Hellman, in Yost, 2004, 24)
Why was it that public key cryptography emerged in the mid-1970s, and not at another time? The
key sharing problem had proved a persistent challenge in cryptography over its centuries-long
history. Why then did it take so long for a solution to emerge, and even longer to come to fruition?
Competition
One possibility is that public key encryption’s development was actively hindered by
competing parties in both the public and private sector. For one, there was already a competing
standard in DES, backed by both IBM and the US government. DES adapted cryptography as conventionally understood for use in computers, itself a novel approach. Given IBM’s centrality to computing at the time, it is unsurprising that DES held considerable
weight. Standards are powerful forces within technical processes (DeNardis, 2009), and can carry
a kind of political agency by inscribing the ideas and viewpoints of their creators. Contests over
standards can thus become a site for “politics by other means” (Abbate, 1999).
Moreover, the inventors faced competition of another sort, though they could not have
known it at the time. They were beaten to the punch by other cryptographers who had already
developed a similar cryptographic system a few years earlier – but at a moment in time and in a
context in which the potential uses of public key encryption were difficult to envision. James H.
Ellis, a researcher at the British signals intelligence agency Government Communications
Headquarters, conceived of a system of “non-secret encryption” as early as 1969, but couldn’t
solve how to implement it. When he brought it to his supervisors, he was told that it was “garbage”
(Espiner, 2010). Ellis moved on, but his colleague, Clifford Cocks, picked it back up a few years
later at the prompting of a fellow mathematician. Cocks successfully developed an implementation
of Ellis’ ideas essentially similar to what became known as RSA, a commercial implementation of
public key cryptography I elaborate on in more detail in Chapter 3. Cocks judged it to be most
important for military use, and again kept the discovery secret. Years later, he recalled he could
not foresee its implications for financial transactions online until the invention of the World Wide
Web in 1989 (Espiner, 2010).
The invention of ‘non-secret encryption’ reveals important aspects of the secretive
environment around cryptographic research engendered by the intelligence agency. Both Cocks
and Ellis later claimed that they were unable to resolve important issues because they could not
publish their ideas. As soon as the research on public key cryptography was out in the open, many
of these challenges were solved relatively quickly. This suggests that the intelligence agencies'
efforts to build a monopoly on cryptographic innovation by investing in the best and brightest
mathematical minds their nations could offer were self-defeating: as the cryptographers
themselves asserted, the closed nature of their working environment hindered the creativity of
their work.
Academic Freedom
Another reason that public key cryptography was slow to take off was the continued intimidation
of academics pursuing cryptographic research. Hellman and his students Steve Pohlig and Ralph
Merkle planned to present their work on public key encryption at the International Symposium on
Information Theory in October 1977, but received pushback in the form of a letter written by a
man named J.A. Meyer to the IEEE, saying:
I have noticed in the past months that various IEEE Groups have been publishing and
exporting technical articles on encryption and cryptology—a technical field which is
covered by Federal Regulations, viz: ITAR (International Traffic in Arms Regulations, 22
CFR 121-128)…these modern weapons technologies, uncontrollably disseminated, could
have more than academic effect (Meyer, personal communication, 1977).
The IEEE replied to the letter stating that it had determined its publications were exempt, but the
letter nevertheless engendered concern among members of the group. IEEE's Director of Technical
Activities, Nirendra Dwivedi, urged the scientists to clear their papers with their companies or
submit them to the State Department for vetting prior to publication.
In the meantime, Gina Kolata of Science Magazine took up the case, uncovering that Meyer
worked for the NSA. After contacting the agency’s public affairs office, Kolata received an official
statement asserting that Meyer wrote the letter as a private citizen and not in his capacity as an
NSA employee. Kolata interviewed officials at the State Department Office of Munitions Control and found
that nearly every interpretation made in the Meyer letter was inaccurate: publications made
available to the public were exempted from ITAR. However, if publications were submitted to the
Office, as Dwivedi had suggested, the NSA would gain control over the work, because the
Office would refer the papers to the NSA for rulings. As Kolata put it, “Meyer was proposing, in
effect, a censorship system by the NSA over the research of the Information Theory group” (Kolata,
1977b).
In response, Hellman turned to Stanford University Counsel John Schwartz for his
determination. In a letter to Schwartz, he made a case for the value of public domain crypto
research:
We were motivated to work in this area by the growing commercial need for data security
and encryption, and the almost total lack of unclassified knowledge which could be applied
to these needs. While there is a body of classified knowledge, it is unavailable for
commercial user…Because of this motivation we have made all of our research in this area
freely available through publication in reputable technical journals. (Hellman, personal
communication, 1977).
Hellman described the “understandable” desire for the NSA to maintain its monopoly on
cryptographic knowledge, but outlined several reasons why the conditions that made such a
monopoly tenable during World War II had changed:
First, there is a commercial need today that did not exist in the 1940’s. The growing use of
automated information processing equipment poses a real economic and privacy threat.
Although it is a remote possibility, the danger of initially inadvertent police state type
surveillance through computerization must be considered. From that point of view,
inadequate commercial cryptography (which our publications are trying to avoid) poses an
internal national security threat.
A second difference between the situation in the 1940s and today is the relative ease of
building cheap, highly secure cryptographic devices…Complex cryptographic devices
which are impossible to break would have been impractically expensive and unreliable in
the 1940s, and any edge that a nation possessed in cryptographic knowledge gave it a
tremendous advantage. Today, these “complex” devices can be built on a highly reliable,
single integrated circuit which sells for $10 in large quantities...
The third difference with the 1940s is that we are not currently involved in a hot war where
lives depend on cryptanalytic intelligence. (Hellman, personal communication, 1977).
Schwartz concluded that Hellman and his students could legally publish and present their work,
but added, “If you are prosecuted, Stanford will defend you. But if you’re found guilty, we can’t
pay your fine and we can’t go to jail for you” (Corrigan-Gibbs, 2014). They concluded that if
ITAR was construed broadly enough to cover their work, it was unconstitutional, but were wary
that the only way to establish this would be through a court case (Hellman, in Yost, 2004).
Schwartz recommended against Merkle and Pohlig presenting, even though it was the norm
for student authors to lead presentations. Hellman had the benefit of the tenure system to support
him if he were to get into any trouble. Hellman left the decision up to the students, who at first
said they wanted to go ahead. However, after speaking with their families, they changed their
minds and said they’d let Hellman present on their behalf (Hellman, n.d.).
The papers were ultimately well-received at the symposium, and Hellman and his students
emerged without any further harassment. Hellman later said that he received evidence that the
highest echelons of NSA were extremely troubled by the publication, even though they had
suggested Meyer had acted alone (Hellman, n.d.). Nevertheless, the episode marked an important
shift in the social conditions around cryptography, bringing the field of study back into the open.
As Stanford Magazine described it decades later in an article commemorating the work: “That a
group of nongovernmental researchers could publicly discuss cutting-edge cryptographic
algorithms signaled the end of the US government’s domestic control of information on
cryptography” (Corrigan-Gibbs, 2014).
Encryption and Technological Change
A final, crucial component influencing the adoption of public key encryption was that of
technological change: though the increased use of computers by the late 1970s meant there was a
larger market for computer security, computers were not yet powerful enough to be of practical
use for implementing public key encryption software requiring large amounts of memory. This
proved nearly fatal for RSA, which approached bankruptcy in 1986, only four years after the business
was founded. It was rescued by the hiring of Jim Bidzos and won its first major contract,
with Lotus Notes, in 1986, but only really took off when its algorithms were adopted for internet security
protocols (RSA Data Security, 2002).[21]

[21] See: Linn, J. (1989) RFC 1113-5: Privacy Enhancement for Internet Electronic Mail, Parts I-III, IETF. Retrieved Dec. 22, 2017 from https://tools.ietf.org/html/rfc1113, https://tools.ietf.org/html/rfc1114, and https://tools.ietf.org/html/rfc1115.
It would take the widespread adoption of communications networks and increased
computer power to bring public key cryptography into full fruition. The Stanford researchers
responsible for its invention were at the vanguard of communications networking, and their
inventions were shaped by their early use of the ARPANET. As Hellman put it years later, “…it
was much before the Internet became the Internet when I was aware of the promise of computer
communications networks. I was seeing communications and computations coming together”
(Hellman, in Yost, 2004, 14). But as I will discuss in the next chapter, network adoption still had a long
way to go before public key cryptography could fulfill this vision.
Conclusion
The invention of public key cryptography radically changed the use of encryption in the
internet era. As I have chronicled in this chapter, its invention was marked by a contestation over
openness and secrecy in the field. Despite the efforts of government agencies to retain their
monopoly over cryptography, public key cryptography capitalized on a brief moment of openness,
eventually bringing the study of cryptography fully into the light. Through key victories such as
the presentation of cryptographic work in public forums, Hellman and his students established that
cryptography could effectively be studied outside the auspices of military intelligence agencies.
Moreover, Feistel and his compatriots at IBM, and later Rivest, Shamir and Adleman,
demonstrated the commercial viability of cryptography in the tech sector.
Crucially, the invention of public key cryptography was the inflection point for a series of
changes in the social conditions around cryptography, as commercial firms confronted the need to
adopt cryptographic solutions to their growing problems of computer security. Lucifer was a
precursor, demonstrating that cryptography could be implemented in computerized systems and
backed by the weight of industry players like IBM and government institutions like NBS. But
public key cryptography ultimately offered the solution most compatible with the needs of networked
computing systems, by facilitating the adoption of secure cryptographic systems that could be set
up in the open.
This points to a final, and perhaps most important, shift for the meaning of cryptography
during this period: the evolution from conceptualizing cryptography primarily as a technique for
secret-keeping and data security to using it for the purpose of protecting individual privacy in the
face of technological change. This arose out of a series of interrelated problems that together
pointed toward a need for a technical solution to the privacy needs of citizens: the rise of database
systems making information more easily searchable, the growth in production of information as a
result of computer networks, and diminishing trust in institutions resulting from government
wiretapping scandals. Public key cryptography offered an implementation that fostered mutual
trust over individual sovereignty, a notion particularly attuned to the communitarian values of the
1970s. But this required a radical rethinking of the value systems of the field, which was not
quickly adopted by all stakeholders involved. Public key cryptography would take several more
years to come to full fruition, and, as I will argue in the remainder of this dissertation, its
implications for a networked society are still the subject of contestation.
Chapter 3: The Crypto Wars and Cryptographic Infrastructure
This chapter examines the evolution of a commercial market in cryptographic software in
the decades following the invention of public key cryptography. I focus on two examples: an
encryption algorithm known as RSA developed by a trio of computer scientists at the
Massachusetts Institute of Technology, and an e-mail encryption program called Pretty Good
Privacy written by independent software developer Philip Zimmermann. Tracing through the
intertwined histories of these two cases, I explain how commercial and regulatory influences
combined to shape the use of cryptography in the 1980s and 1990s.
This chapter is an illustration of the process of imbrication, or how social and material
agencies become entangled in sociotechnical systems (Leonardi, 2012). The case studies of RSA
and PGP illustrate two distinct types of entanglement: first, they demonstrate how an author’s
intentions can be persistently embedded in the structure of software code: RSA was primarily
designed for commercial use from the outset; though the company later sought to cultivate a
reputation as an advocate for the rights of software publishers, it was always, first and foremost,
designed for corporate adoption. By contrast, though PGP developer Phil Zimmermann hoped to
make money from his software, it was always designed for the purpose of protecting users’ privacy.
In both instances, the intentions of the authors had a downstream effect on how the code was used,
accelerated by influences and pressure from a variety of market and regulatory actors.
Second, these cases demonstrate the combined force of the market and law in advancing
particular uses and understandings of cryptography. By the close of the 1990s, encryption was in
widespread use among internet users, but was primarily designed around providing security for
financial transactions necessary to make e-commerce possible. While developers like Zimmermann
still actively built encryption designed to protect the privacy of users' online activities, this
software was pushed to the margins by the active policing of patents by competitors and its unclear
legal status.
Untwining the entangled dynamics of market, law, and authorship is particularly
consequential for this period in the development of public key cryptography. During the 1980s and
1990s, encryption began to take on the functions of infrastructure – embedded, often invisible
algorithms deeply imbricated in securing software code, made necessary for many software
systems to 'work'. Infrastructures are also important forces in structuring society (Edwards, 2003). Here, I
argue that the particular ways that encryption was built into infrastructure worked to encourage
the development of the e-commerce industry, while relegating privacy protective uses to the
margins. But infrastructures are also tools in practice, always in the process of becoming, and are
mutually shaped by their social and technical dimensions (Star & Ruhleder, 1996). What factors,
then, were at work in shaping cryptographic infrastructure in these particular ways?
One possible answer, I argue, is the constant possibility of regulation, a persistent threat
that was never definitively resolved. A running theme of this dissertation is that since the early
20th century, the US government has held an uneasy position with regard to the use of cryptography
in the public sector. As I described in the last chapter, in the 1970s, regulators acknowledged the
importance of adopting a standardized method of encryption for the computerized communications
of US government agencies and businesses. Recognizing the importance of public trust in
whatever algorithm was adopted, the National Bureau of Standards and National Security Agency
collaborated with an independent corporation, IBM, in the creation of the Data Encryption
Standard.
By the 1980s, the government sought to strictly implement a regime to regulate the export
of encrypted software. However, new actors and organizations involved in the development and
distribution of cryptographic software made the enforcement of these controls more challenging.
The examples of RSA and PGP illustrate the tensions for regulators between US government
interests in promoting commercial enterprise and protecting national security. Ultimately, these
tensions resulted in a single dominant player in the commercial use of cryptography in the 1990s: RSA.
Zimmermann’s PGP opened up a new realm of possibilities for hobbyist cryptographers, but its
contested legal status – combined with its complex code and RSA’s aggressive policing of its
patents – prevented widespread industry adoption.
This resulted in a divide in the applications of cryptography: legitimated commercial
applications of cryptography were largely invisible to users, marketed as ‘security’ but with little
transparency as to the inner workings of the cryptosystem. This meant that though the actual
number of people using cryptographic tools grew considerably over the 1980s and 90s, for the
most part cryptography was an invisible, embedded part of software programs and networked
infrastructures. On the other hand, PGP took on great significance for a small community of
cryptography advocates, in part because of its legal troubles: Zimmermann and his software
developed a 'renegade' status that built cachet among a nascent cryptographic community.
Ultimately, this established the preconditions for the formulation of a new cryptographic social
imaginary.
A New Kind of Cipher: RSA
In 1977, Martin Gardner published an installment of his regular Mathematical Games column
in Scientific American, titled “A new kind of cipher that would take millions of years to break”.
The monthly column was well-known for presenting puzzles and noteworthy findings about
mathematics for a popular audience. Citing the emerging issue of “electronic eavesdropping”, he
wrote in the column,
It is hardly surprising that in recent years a number of mathematicians have asked
themselves: Is it possible to devise a cipher that can be rapidly encoded and decoded by a
computer, can be used repeatedly without changing the key and is unbreakable by
sophisticated cryptanalysis? The surprising answer is yes. (Gardner, 1977).
Gardner outlined the basic principles of public key cryptography as outlined in the Diffie-Hellman
paper, highlighting a second breakthrough: an implementation of public key cryptography using
prime numbers that had been developed by three computer scientists at the Massachusetts Institute
of Technology, Ronald L. Rivest, Adi Shamir, and Leonard Adleman.
Rivest came across the Diffie-Hellman paper after one of his students said they thought he
would find it interesting. He felt the paper laid out compelling theoretical questions and useful
applications, but it hadn’t gone as far as explaining how to implement a public key system. For
example, he observed that you might use it to contact your stockbroker privately with their public
key. This was a notable, if unintentional, amendment of the anecdote Diffie and Hellman outlined
in the paper, which presented a very different scenario: that public key encryption could be
used to protect against a dishonest stockbroker buying and selling for personal gain by forging
orders from clients. He brought it to his colleagues, Shamir and Adleman, and they worked
together on assessing what a secure implementation of the Diffie-Hellman system would look
like. They reached out to Gardner because they needed help in researching factoring, which was a
relatively arcane specialty in mathematics at the time. Gardner, excited about the project, wrote a
column about the team’s work – the first publication of the paper outlining the algorithm that came
to be known as RSA, after its authors' initials (Rivest, 2016).
Rivest, Shamir and Adleman offered a free copy of the full version of the paper, “A Method
for Obtaining Digital Signatures and Public-Key Cryptosystems”, to any reader who sent a self-
addressed, stamped envelope to their address, anticipating that they would send 400 copies.
Gardner’s column gave them more attention than they expected, and they ended up sending out
over 4,000. Before they could distribute their work, however, they ran into a familiar problem.
They were told (by whom is unclear) that distributing copies of the paper could constitute a
violation of the International Traffic in Arms Regulations or other US laws.[22] After obtaining a
sign-off from MIT's lawyers, they mailed out copies of the article, offering a $100 reward to
anyone who could come up with a weakness in the scheme (Rivest, Shamir & Adleman, 1978).
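The scheme the paper describes can be illustrated in a few lines. The following is a standard textbook-style sketch rather than code from the paper itself, with primes small enough to follow by hand; the security of the real system rests on primes hundreds of digits long, so that factoring n is infeasible.

    # Textbook RSA with toy primes; requires Python 3.8+ for pow(e, -1, m).
    p, q = 61, 53
    n = p * q                  # public modulus (3233); factoring n breaks the scheme
    phi = (p - 1) * (q - 1)    # 3120
    e = 17                     # public exponent, coprime to phi
    d = pow(e, -1, phi)        # private exponent, the modular inverse of e (2753)

    m = 65                     # a message encoded as a number smaller than n
    c = pow(m, e, n)           # anyone can encrypt with the public key (e, n)
    assert pow(c, d, n) == m   # only the private key (d, n) recovers the message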
MIT decided to apply for a patent on the authors’ behalf at the end of 1977. The patenting
process took some time because of potential interference with Stanford’s patents on the Diffie-
Hellman system.[23] By 1983, the conflict was resolved, at least temporarily, and Rivest, Shamir and
Adleman decided to set up a company, RSA Data Security, to explore potential commercial
applications.
[22] Rivest's language is somewhat unclear about who they were communicating with; it is possible they received a national secrecy order like IBM did, but I could not confirm whether this was the case.

[23] Thus far I've been unable to find any more details about what this specifically entailed.
The company was not immediately successful. They first sought to create an RSA chip that
could be used for secure phone systems – but had difficulties with the supplier they had chosen to
manufacture the chips (Rivest, 2016). They turned instead to software, a market undergoing a period of
growth as general purpose computers became increasingly popular – but again, ran into challenges.
They found a general lack of interest in distributed security in a non-military context, and for those
who were interested, processing power was too slow for computerized encryption to be very useful
(Rivest, 2016).
By 1986, RSA Data Security (RSADSI) was on the brink of bankruptcy, its founders able to devote
little time to keeping it afloat. Jim Bidzos was hired as president, and set out to save the company
by establishing a stronger customer base. The turnaround came with a contract signed with Iris Associates,
which was working on developing a software program called Lotus Notes, an e-mail and
groupware program that would be marketed to businesses. Iris Associates felt it was important to
have security protections in the program, but didn’t realize that RSA was patented. RSADSI helped
develop a software toolkit that would enable them to implement RSA, which led to its first
successful contract with Lotus for an RSA software license in July 1986 (Bidzos, 2004).
RSADSI had finally struck on a business model that worked: licensing software libraries
that implemented the RSA algorithm. Bidzos closed deals with Motorola to use RSA in its
commercial secure telephones and with DEC to develop a secure network system, a project that
was eventually killed because it couldn’t be exported (Garfinkel, 1995). By 1994, over 100
companies incorporated RSADSI’s toolkit in their products (Garfinkel, 1995), and the company
sold over 4 million copies of its software with the support of clients like Apple, AT&T, Lotus,
Microsoft, Motorola, Northern Telecom, Novell, and Sun Microsystems (Markoff, 1994a). The
hardware business largely fell off the company's radar.[24]

[24] During this time period Rupert Murdoch presented Bidzos with an acquisition offer seeking RSA technology to develop secure smart cards for the SkyTV system in Europe. The deal fell through when they couldn't agree on a price – Bidzos felt that Murdoch drastically undervalued the worth of RSA technology.
RSADSI’s toolkit approach to securing software sought to minimize the complexity of its
cryptosystem. As Rivest described it:
You want something that’s sort of simple, you know, little lock icon that goes on. User
interfaces for security is an area which I haven’t studied deeply, but there’s a lot of
surprising things that happen there where people just don’t pay attention to things. They
click on banners that come up just to make them go away without reading them and so on
and so forth. So I think it’s designing a security system that minimizes the amount of
complexity, the amount of understanding that a person needs to have and does succeed at
getting their attention when it’s important to get their attention. (Rivest, 2016, 44).
This meant that its encryption was often embedded in software and hardware in ways that the end
users would not necessarily be conscious of – and perhaps they didn’t need to be. Unless the RSA
algorithm was incorporated into the marketing strategy for a software product, many end
users of RSADSI products were unaware of whether they were using the algorithm, or whether
they were using cryptographic software at all.
Bidzos viewed the company's customers as fitting into three categories: users who
benefitted from the security of its software and were aware of it, like those using Lotus Notes;
users who did not know the encryption was there but did not need to in order to benefit from it,
like those using Novell; and users whom software publishers did not want to know they were using
encryption software. Many of the software publishers who were RSADSI's direct clients used
cryptographic copyright protection schemes to prevent the copying of their software and didn’t
want their customers to know these protections were in place. For example, Atari used a public
key cryptosystem to authenticate that its game cartridges were signed by the company’s public key
to prevent counterfeiting (Garfinkel, 1995).
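The Atari arrangement inverts the encryption pattern: the vendor signs with the private exponent and the device verifies with the public one. A minimal sketch, reusing the textbook parameters from above (the 'hash' value here is a made-up stand-in, not Atari's actual scheme):

    # Toy RSA signature, continuing the textbook parameters from earlier.
    n, e, d = 3233, 17, 2753
    h = 123                     # hypothetical stand-in for a hash of the cartridge code
    sig = pow(h, d, n)          # signed at the factory with the private key
    assert pow(sig, e, n) == h  # verified on the console with the built-in public key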
The second category (users who did not need to know about the encryption but were not
prevented from knowing) proved to be the most important for RSADSI, and the company saw a significant
increase in its user base when RSA was incorporated into a security protocol designed by a team
of engineers working on the Mosaic browser: Secure Sockets Layer (SSL). Led by Taher Elgamal,
the engineers’ objective was to implement encryption into the connections between a web page
and a browser, creating a secure layer for the transport of sensitive data. This was a critical function
for Mosaic, later re-named Netscape Communications, to achieve its aim to be the ‘premier
provider of open software to enable people and companies to exchange information and conduct
commerce’ on the internet: encrypting and authenticating the transmission of credit card data was
critical for people – and more importantly vendors – to put their trust in online payment systems.
Mosaic foregrounded the security of its products as a key differentiating factor in many of the
company’s early marketing materials (Mosaic, n.d.), but this became a less prominent aspect of its
product promotion in subsequent versions of Netscape (Netscape, 1995). By then, RSA encryption
was a critical, but largely invisible, part of internet infrastructure designed to make e-commerce
possible.
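SSL's successor, TLS, preserves this invisibility. As a present-day sketch of how little of the machinery is exposed to application code (using Python's standard library; the hostname is an arbitrary example), the handshake, certificate validation and encryption all happen beneath two wrapper calls:

    import socket
    import ssl

    # The negotiation and encryption happen invisibly beneath the
    # application, much as SSL did inside the Netscape browser.
    host = "example.com"  # any HTTPS-capable host works here
    context = ssl.create_default_context()
    with socket.create_connection((host, 443)) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            print(tls.version())   # e.g. 'TLSv1.3'
            print(tls.cipher())    # the negotiated cipher suite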
Though the remarkably successful browser meant a significant increase in the number of
people who were using RSA's encryption algorithm on a day-to-day basis, it also meant that many
of those end users were unconscious of the significance of SSL or how the encryption worked.
Importantly, they were also unaware of which transactions were not encrypted – though SSL
protected their communications with web servers, averting eavesdropping by third parties, it did
not mean that their data would be deleted from or stored safely on those servers. This took on
greater significance in the following years: as Edward Snowden illustrated nearly two decades
later, users' data was not kept private from the companies who owned those servers or anyone who
demanded access to them, and was left vulnerable to surveillance by a range of actors.
Government Regulation and Export Controls
The growth in the number of data transmissions encrypted using independently developed
RSA, rather than government-approved DES, motivated the US government to increase its
regulatory oversight of cryptographic software. Though it experimented with a number of
instruments, the primary legal regime it used was export controls. US export control policy
classified software with encryption capabilities as "munitions" requiring an export license under
the Arms Export Control Act and the regulatory regime implementing it, the International Traffic in
Arms Regulations (ITAR). The Act, which passed in 1976, reflected the Cold War-era position that
cryptography is a critical national security resource.[25]
This law provided the basis for the threats
leveraged against Hellman and his graduate students when they sought to present their findings on
public key encryption at the IEEE's Information Theory symposium in October 1977 – a panel on which
Rivest, Shamir and Adleman also planned to present their implementation, though they have never
said anything publicly about whether they, like Hellman, were concerned about the legal
implications of doing so.

[25] The AECA replaced a prior law, the Export Administration Act of 1969, which codified and made changes to the US export control system – shifting oversight from the Executive Branch to Congress, and creating a US Munitions List of items deemed "inherently military in character". The Munitions List under EAR also included cryptography (Dam & Lin, 1996).
Export controls also presented a considerable challenge to software makers who relied on
international sales. Under the system, software publishers labored to create new software products
unsure of whether they would ultimately be approved for export by the government. The
uncertainty of the system harmed the publishers and RSA both: a major deal with Digital Equipment
Corporation (DEC) to implement RSA in a multilevel secure operating system went south when
government officials warned DEC that they would not be able to market the operating system
outside of the US (Bidzos, 2004). But in July 1992, the Software Publishers Association reached
an agreement with the Bush Administration approving an expedited seven-day review process for
software that implemented RSA encryption algorithms (Rosenthal, 1993). As long as the software
used RSA's RC2 or RC4 ciphers and limited the key bit length to 40 bits,[26] it would be
guaranteed fast-track review and could be exported for international users (Garfinkel, 1995, 137).
The deal worked out nicely for RSADSI, solidifying its position as an industry standard in
software publishing. But the process of reaching the deal illustrates the complicated tangle of legal
and commercial incentives that shaped RSADSI’s position in the market for security software
during the 1990s.
[26] Limiting the key bit length reduces the amount of computing time needed to crack an encrypted message using a 'brute force' approach, in which a computer program tries every possible combination. As computing power increases, so does the speed at which encrypted messages can be broken: this is why quantum computing systems, which would be exponentially more powerful, are expected to make all public key cryptographic systems obsolete.
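A back-of-the-envelope calculation shows what the 40-bit cap conceded. The sketch below assumes a hypothetical search rate of one billion trial decryptions per second, chosen purely for illustration:

    # Back-of-the-envelope brute-force estimate; the trial rate is a
    # hypothetical figure for illustration, not a measured benchmark.
    rate = 10**9                          # assumed trial decryptions per second
    for bits in (40, 56):
        seconds = 2**bits / rate
        print(f"{bits}-bit key: {seconds / 3600:,.1f} hours to exhaust")
    # 40-bit: ~0.3 hours; 56-bit: ~20,016 hours (over two years) at this rate

At that assumed rate, the export-grade keyspace falls in under half an hour, while full 56-bit DES holds out for more than two years.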
Cryptographic Standards
In its decision to endorse the Data Encryption Standard (DES), the government
acknowledged the importance of cryptographic protections to securing data for the nascent
computing industry. DES – and a stronger implementation known as “triple DES”, in which data
is encrypted using DES a total of three times with three distinct 56-bit keys – had become the
standard for banks and financial institutions (Rubinstein & Hintze, 2000).
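The triple construction composes the single cipher with itself. A minimal sketch of the standard encrypt-decrypt-encrypt (EDE) pattern with three independent keys, using the PyCryptodome library (assuming it is installed; the keys and block below are placeholder values):

    # Sketch of the three-key EDE composition using PyCryptodome;
    # keys are 8 bytes each, of which 56 bits are effective.
    from Crypto.Cipher import DES

    def ede_encrypt(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
        # Encrypt with key 1, decrypt with key 2, encrypt again with key 3.
        step1 = DES.new(k1, DES.MODE_ECB).encrypt(block)
        step2 = DES.new(k2, DES.MODE_ECB).decrypt(step1)
        return DES.new(k3, DES.MODE_ECB).encrypt(step2)

    def ede_decrypt(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
        # Decryption runs the same stages in reverse order.
        step1 = DES.new(k3, DES.MODE_ECB).decrypt(block)
        step2 = DES.new(k2, DES.MODE_ECB).encrypt(step1)
        return DES.new(k1, DES.MODE_ECB).decrypt(step2)

    k1, k2, k3 = b"key1key1", b"key2key2", b"key3key3"  # placeholder keys
    ct = ede_encrypt(b"8bytes!!", k1, k2, k3)           # one 64-bit block
    assert ede_decrypt(ct, k1, k2, k3) == b"8bytes!!"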
But though government agencies sought to encourage the commercial viability of the
technology industry, the government remained uncomfortable with the independent development
and use of cryptography. In 1980, following the conflicts over the International Symposium on
Information Theory, the NSA reached out to the American Council on Education requesting it
convene a voluntary study group on public cryptography research, which included Hellman,
representatives from the computer science and mathematics departments of six leading US
universities, and Daniel C. Schwartz, the NSA's General Counsel.
The group agreed to test out a voluntary system of prior review so as to avoid the
implementation of compulsory regulations – a system that would seek to reconcile First
Amendment rights, NSA’s concern for protection of US communications security and intelligence
gathering capabilities, and the growing importance of privacy protections for individuals and
corporations (Public Cryptography Study Group, 1981). But it did not lift the onerous regulations
placed on applied cryptography: cryptographic software built for export still had to undergo review.
This status quo seemed poised to change in 1990, when the Bush Administration issued a
directive that items be removed from the munitions list unless significant US national security
interests would be jeopardized. The software industry lobbied for cryptography to be considered
in this review and for an exception for all "mass market" software that would give a limited range
of encryption uses in software products decontrolled status – essentially, excluding them from
regulatory oversight under the munitions controls. Their hopes were dashed when the State
Department proposed a rule that would codify the status quo by allowing for software products to
be considered on a case by case basis. The uncertainty of the existing munitions control system
would be maintained.
Software publishers then turned to Congress, arguing that the regulation harmed US
business interests because cryptographic products were already available abroad, and put US
software publishers at a competitive disadvantage compared to other international software
companies. They found support within the House Foreign Affairs Committee, which proposed a
bill that would transfer jurisdiction over mass market software programs, including those that used
cryptography, to the Department of Commerce. But the bill received strong opposition from the Bush
Administration, which promised to veto any legislation that contained such a measure.
Finally, the Software Publishers Association met with White House officials to assess
whether their concerns could be met within the existing export controls framework. They struck a
deal to ensure that software publishers could develop their products with a degree of certainty that
they would be approved for export: as long as they used two RSA algorithms, and implemented
them with key sizes less than 40 bits, they would receive fast-track review and be guaranteed
approval (Rubinstein & Hintze, 2000). The software publishers were not entirely happy with the
agreement – it meant hobbling the security of their products, which impeded their competitiveness
on the international market – but it at least meant some reduction of the regulatory burden
(Rosenthal, 1993).
The Escrowed Encryption Standard
The deal was overshadowed when the Clinton Administration introduced what it called the
Escrowed Encryption Standard (EES) in 1993. EES was a new stage in a policy debate that had
begun in earnest a few years earlier. The FBI was concerned that its wiretapping powers,
authorized under the Omnibus Crime Control Act of 1968 and used by law enforcement agencies
to prosecute organized crime, were waning in the face of increasingly secure communications
equipment. At the FBI's urging, the Senate in 1991 introduced an omnibus crime bill, the
Comprehensive Counter-Terrorism Act of 1991 (S.266), which contained language that required
electronic communication service operators and equipment manufacturers to “ensure that
communications systems permit the government to obtain the plain text contents of voice, data and
other communications when appropriately authorized by law” (Bidgoli, 2006).
RSADSI, Computer Professionals for Social Responsibility, and the newly-formed
Electronic Frontier Foundation launched an industry-wide offensive against the language in the
bill (Garfinkel, 1995; Van der Leun, Godwin & Kapor, 1991). After the groups met with Senator
Patrick Leahy, who chaired the Senate Judiciary Committee’s sub-committee on Technology and
the Law, the encryption clause was eliminated from the bill (EFF, 1991). Another bill, the Digital
Telephone and Communications Privacy Act, took a slightly different approach, targeting the
design of telecommunications systems themselves rather than their operators. The Act would
require that new communications technologies be designed to facilitate government
wiretapping, and followed an attempt by the FBI to convince the telephone industry to go along
with this plan voluntarily. The bill also failed in Congress in 1992, following opposition from
CPSR and EFF.
The Escrowed Encryption Standard took a non-legislative approach to achieving a similar
goal: it would replace the Data Encryption Standard with a new standard that would put the
decryption key in escrow, so that government agencies could gain access to wiretapped encrypted
phone conversations or computer data by submitting a legal request (Froomkin, 1995). As Steven
Levy described it in his 1994 New York Times Magazine article "Battle of the Clipper Chip", the
government's objective was not to eradicate strong cryptography, but to prevent it from becoming
routine: "by making Clipper the standard, the Government is betting that only a tiny percentage of
users would use other encryption or try to defeat the Clipper" (Levy, 1994).
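The escrow mechanics rested on splitting each device key between two agencies so that neither could act alone. A minimal sketch of an XOR key split of the kind such schemes rely on (the 80-bit length matches Clipper's Skipjack cipher; the rest is illustrative):

    import secrets

    # Split an 80-bit device key into two escrow shares; each share is
    # uniformly random on its own, so one agency learns nothing alone.
    key = secrets.token_bytes(10)                          # 80-bit unit key
    share_a = secrets.token_bytes(10)                      # held by agency A
    share_b = bytes(k ^ a for k, a in zip(key, share_a))   # held by agency B

    # A lawful wiretap request requires both shares; XOR recombines them.
    recovered = bytes(a ^ b for a, b in zip(share_a, share_b))
    assert recovered == key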
This was, in some respects, a step back from the government's position two
decades earlier when it announced plans for the Data Encryption Standard: despite NSA's
involvement in approving its development, regulators deemed DES strong enough that it was
subject to export controls, leading some to speculate that the agency regretted creating the standard
(Froomkin, 1995). The National Institute of Standards and Technology announced its plans for the
replacement in April 1993. Chipmaker Mykotronx would develop the chip, which would be called
MYK-78T, colloquially known as the Clipper Chip. AT&T also joined with a plan to launch a new
line of phone security devices that could be attached to telephone lines to encrypt conversations
using the new standard.
In Levy’s view, it was in fact AT&T that opened up the opportunity for NSA to develop
the standard. In a conversation reminiscent of that between IBM and NSA in the 1970s, AT&T
first came to NSA with plans to sell a line of secure phone devices called the “Surity 3600” which
would use the nonexportable version of DES. NSA suggested AT&T instead try using a computer
chip it had developed called Capstone, which was a cryptosystem implemented with a backdoor.
In return, AT&T received a guarantee that the FBI would buy 9,000 of the devices, and would
receive favorable consideration for export (Levy, 1994).
Other actors were less thrilled: it was a move away from what New York Times reporter
John Markoff described as the “emerging de facto standard for computer security”, RSA. Given
the considerable barriers posed by export controls, the voluntary standard had potential to strongly
influence the market away from RSA. Bidzos responded by saying "The computer industry has
moved on its own. That movement hasn't been viewed favorably by certain Government agencies
and this is their attempt to force those industries down the path that the Government wants"
(Markoff, 1993).
This proved to be an opportunity for RSADSI to cultivate its reputation by rallying the
nascent data security community in opposition to government regulations. The company had begun
hosting annual conferences in 1991, the first of which focused solely on the National Institute of
Standards and Technology's plan to issue a Digital Signature Standard, which the company was
concerned would likely undermine RSADSI’s position in the market (Greene, 2011). Subsequent
conferences grew in scope to address changes in public policy, announce new products that
incorporated RSA technologies, and discuss new innovations in the field (Dyson, personal
communication, 1994).
RSADSI was active in its opposition to the Clipper Chip, launching a ‘Sink Clipper!’
advertising campaign that encouraged its clients to boycott Clipper devices and the companies
making them, to write their representatives in Washington, and to support vendors
selling products using ‘real’ RSA encryption technology. No longer content for RSA to be
invisibly incorporated in its software, the company exhorted users to:
Find out exactly what kind of security you are getting in your next e-mail, e-forms, cellular
devices, operating system or remote access software purchase. Whenever you shop for
software that makes claims about security, be sure to ask the sales rep about which
algorithms are used inside. Demand the very best in encryption: insist on RSA (RSA, 1994).
Figure 3.1: “Sink Clipper” advertisement, Source: RSA Data Security International, 1994.
The company had an incentive in stoking these policy debates: its success, at least in part,
relied on lingering suspicions around the security of DES and the possibility that the NSA had
interfered to weaken the standard. Bidzos, in particular, cultivated a reputation as a bulldog who
fought against government oversight.[27] As he put it in one news article, "For almost 10 years, I've
been going toe to toe with these people at Fort Meade. The success of this company is the worst
thing that can happen to them. To them, we're the real enemy, we're the real target." (Levy, 1994).
The company was joined by a diverse collective of compatriots, including Computer
Professionals for Social Responsibility, the Electronic Frontier Foundation, the Industry Coalition
on Technology Transfer, the US Council for International Business, and a collective of amateur
cryptographers who called themselves cypherpunks. The group successfully rallied opposition
during the 60-day public comment period for the proposal; NIST received 320 comments, only
two of which favored the standard (Levy, 1994).
In 1994, Congress again considered the Digital Telephony Act, passing the bill – but with
a carve-out excluding "information services", including the Internet and online networks (EFF,
1994). In the same year, AT&T researcher Matthew Blaze published a paper uncovering a design
flaw in the Law Enforcement Access Field of the Clipper Chip that would enable anyone to tamper
with it to communicate in privacy with anyone else who had a Clipper phone – so despite the key
escrow system, it was possible to use the Clipper Chip to communicate over an encrypted channel
that law enforcement would have no access to (Blaze, 1994). With the purpose of the chip rendered
moot, and its original motivations already largely addressed through legislation, the Clipper Chip
never reached widespread adoption. This concluded the first salvos of what would
come to be known as the Crypto Wars.
[27] RSADSI's relationship with the NSA evolved in the mid-2000s (after Bidzos' departure) to become much more cooperative. The Snowden revelations unveiled a secret $10 million contract signed between RSADSI and the NSA creating a back door in the random number generator in RSADSI's BSafe software. The sum represented over a third of RSADSI's revenue for the entire previous year (Menn, 2013).
The example of RSA illustrates several key elements: first, it explains how the day-to-day
use of encryption by end users grew over the course of the 1980s and 90s, though this spread was
largely unbeknownst to those users – many of them did not know or care about the use of
encryption, or understood it as ‘security’ for software. At the same time, this growth in use of
encryption was viewed as a threat by government regulators, who attempted a variety of tactics to
curb its use: export controls, direct regulation and proposing a new encryption standard. Of these
three, export controls proved the most successful form of control. Lastly, it shows how these two
factors – market demands and regulatory oversight – combined to make RSA a de facto standard
for the software industry.
Pretty Good Privacy: A Non-Commercial Approach to Encryption Software
“It's personal. It's private. And it's no one's business but yours,” Phil Zimmermann wrote
in the 1991 guide for PGP e-mail encryption software:
You may be planning a political campaign, discussing your taxes, or having a secret
romance. Or you may be communicating with a political dissident in a repressive country.
Whatever it is, you don't want your private electronic mail (email) or confidential
documents read by anyone else. There's nothing wrong with asserting your privacy. Privacy
is as apple-pie as the Constitution (Zimmermann, 1991).
Over the course of the early 1990s, while RSA was emerging as the gold standard for the
technology industry, an outside competitor materialized that took a similarly adversarial approach
to government, but without commercializing the software or accommodating US export control
laws. Not quite proprietary, not quite usable and, as it turned out, not quite secure, PGP's history
offers a useful alternative perspective on cryptographic software development in
the 1980s and 1990s.
The kernel of the idea for PGP grew out of a relationship between its creator, Phil
Zimmermann, and Charlie Merritt, who ran a small company called Merritt Software. Merritt sold
security software implementing RSA to customers who were, primarily, commercial entities with
offices in nations with autocratic governments who wanted to ensure privacy from their local
competitors. Zimmermann decided he wanted to develop an implementation of RSA for the IBM
PC, and asked Merritt to fly out to meet him in Boulder, Colorado to teach him what he knew.
Merritt also made plans to meet Jim Bidzos, who had just become president of RSA Data Security
(Garfinkel, 1995).
Zimmermann was deeply concerned about the possibility of nuclear war and the advent of
the military-industrial complex.[28] He was particularly anxious that electronic mail would make
possible an Orwellian system of surveillance – he feared that while steaming open paper letters to
read them individually was time and labor-intensive, intercepting and scanning for keywords in e-
mails could “be done easily, automatically, and undetectably on a grand scale”, much like driftnet
fishing (Zimmermann, 1991). Zimmermann espoused a particularly liberal set of politics –
distrusting of authority, but expressing faith in democratic principles and the need to protect civil
liberties (Coleman, 2008). He regarded privacy as “apple-pie as the Constitution,” and set out to
create technologies to protect it (Zimmermann, 1991).
[28] These concerns were shared by Martin Hellman, and fighting nuclear proliferation became a prominent theme in his work from the 1980s to the present day. Hellman became involved with the Beyond War movement, which was a collective of Silicon Valley people who fought to change societal thinking about nuclear weapons and war. Hellman sees this work as highly consonant with his research on encryption: as he said in an article in Stanford Magazine, both are about "radically new ways of communicating" (Myers, 2016).
He settled on building a system for encrypted e-mail, anticipating that:
We are moving toward a future when the nation will be crisscrossed with high capacity
fiber optic data networks linking together all our increasingly ubiquitous personal
computers. E-mail will be the norm for everyone, not the novelty it is today. The
Government will protect our E-mail with Government-designed encryption protocols.
Probably most people will acquiesce to that. But perhaps some people will prefer their own
protective measures. (Zimmermann, 1991).
He named the system Pretty Good Privacy, a reference to Garrison Keillor's radio broadcast A
Prairie Home Companion.
By 1991, Zimmermann had the core structure of his program down, and decided he wanted
to distribute it as shareware, which would allow it to be freely copied but would require users to
send a check to the author in order to get all components to work. He wrote to Bidzos, whom he'd
met at a dinner with Merritt in Boulder some years before, asking for a royalty-free license for a
project that would do RSA and DES encryption on a PC. Acknowledging an obvious similarity to
existing RSA software, in the letter he said “This would be somewhat analogous to your MailSafe
or ComSafe products. I guess it would sort of compete with these products. I suspect these products
are not the backbone of your company’s cash flow, anyway. I just think they would be fun for me
to do" (Garfinkel, 1995, 97).[29]

[29] Simson Garfinkel's book PGP: Pretty Good Privacy provides the most detailed account of the interactions between Bidzos and Zimmermann I was able to find. The book is a widely recommended primer on the software, but is unclear as to whose version of the events it describes: as some of the phrasing suggests it may be based on Bidzos' account, its portrayal of Zimmermann's lackadaisical view toward RSA patents should be read with some skepticism.
Bidzos adamantly refused, denying that he had ever offered Zimmermann a free license
and saying he would never do so, as this could mean forfeiting the right to RSADSI’s patents. The
company was concerned that offering a free license to one potential client could be interpreted by
others as a reason not to pay royalties. This made the shareware version of the software impossible.
Before Zimmermann could pursue other avenues, the Senate announced a new Omnibus Crime
Bill, S. 266, that would have important implications for PGP. The bill included language
mandating that communications providers have the capacity to provide the plaintext of any
communications encrypted on their software or devices – a program like PGP that made this
impossible would be in clear violation.
According to Garfinkel, Zimmermann was deeply concerned about the bill: he was already
close to losing his house, and was concerned he would miss a big payoff from the release of PGP.[30]

[30] Garfinkel's account is unclear as to how Zimmermann planned to monetize the shareware version of his program, or whether he was paid for the ultimately freely distributed version.
He removed DES from the program, replacing it with a new cryptosystem he called Bass-O-Matic,
and released the program, calling it “RSA public key cryptography for the masses”. Zimmermann
claimed copyright on the software, but provided the source code for free under the GNU General
Public License on the assertion that the source code for RSA's MD4 signature function and LZHUF
functions were already in the public domain – a bold claim that would soon be contested by RSA.
To release the program without violating export controls (at least, overtly), he gave a diskette
containing a copy of the program to a friend, who posted it on Usenet.
Unlike RSA's users, users of PGP explicitly sought out cryptographic software – though in time
PGP, and a variant called GnuPG, would be increasingly incorporated into other security
infrastructures, in the early years PGP was explicitly marketed as encrypted email software. It was
notoriously difficult to use, requiring a considerable amount of time to configure – as I will discuss
in the next chapter, it contributed to the formation of a community of hobbyist cryptographers who
worked together to develop implementations for different kinds of personal computers. Combined
with the dubious legal status of the program, using PGP was an explicitly political act; this only
increased its appeal among a community of users who were explicit about their objectives in
accessing, configuring, implementing and, most of all, troubleshooting the software.[31]

[31] Zimmermann himself has said he finds PGP challenging to use, and no longer runs the program on any of his devices (Franceschi-Bicchierai, 2015). The software's notorious difficulty led to the development of an entire genre of academic articles on usability devoted to studying the challenge of communicating security concepts to PGP users: the first was titled "Why Johnny Can't Encrypt", and future articles riffed on this theme with titles such as "Why Johnny (Still) Can't Encrypt", "Why (Special Agent) Johnny (Still) Can't Encrypt", "Confused Johnny" and "Why King George III Can Encrypt".
Coleman and Golub (2008) argue that Zimmermann’s decision – his motivations outlined
in the user guide to PGP – was an “act of civil disobedience” formulating a new moral genre:
individual autonomy and freedom from government interference in the register of cryptography.
Zimmermann was an outsider compared to Rivest, Shamir and Adleman, all of whom had associations
with elite academic institutions that gave them social currency that Zimmermann lacked, but made
it less likely they would adopt a strong oppositional stance. Though Bidzos sought to cultivate a
business reputation for challenging government regulation, this fit comfortably within the
technology industry’s general opposition to government oversight and a culture of elite liberalism.
By contrast, Zimmermann had a long track record of opposition to US military endeavors
through his anti-nuclear activism. His work on technology was deeply tied to his ethical
sensibilities: the idea that individual autonomy, self-reliance and self-control can be applied to the
world of digital information and enacted through code. Zimmermann also departs from more
traditional liberal commitments in one important respect: the protection of property. Coleman
and Golub argue there are many distinct and sometimes conflicting moral genres among hacker
communities, labeling Zimmermann's variant crypto-freedom. I expand on this ideology, which I
describe as crypto-anarchy, in the next chapter; but for the purposes of this chapter, I focus on the
consequences of the decision for Zimmermann and for PGP.
Patent Battles
I found no evidence that Zimmermann engaged directly with the free and open source
software communities while developing PGP, but the battle over its release would have important
implications for overlaps between the F/OSS and cypherpunk communities in years to come.
Zimmermann’s release of the software – possibly in violation of export control laws – set off a
patent battle that made RSA and the US government strange bedfellows. Bidzos was furious, but
sent Zimmermann a letter saying that if he stopped distributing PGP he would not sue him for
willful patent infringement, which Zimmermann signed. He asserts that he did not distribute any
copies of the software after signing the letter (Garfinkel, 1995), but versions of PGP nevertheless
appeared on ftp sites in Europe and Australia that leaked back into the US (Levy, 1994b).
Even though Zimmermann no longer distributed the software, it continued to circulate, and
RSA – through an entity known as Public Key Partners, a sublicensee of its patent that also held
the patents for Diffie-Hellman – began to go after universities, the EFF, CompuServe and AOL to
erase copies of PGP from their systems. Given RSA’s prominence in the industry, these threats
were taken seriously by the lawyers of technology companies: for example, DEC’s lawyers
determined that the company could be charged with “aiding and abetting an illegal export” for
hosting a message posted by one of its employees explaining where to access PGP. The company
issued an advisory notice to its system and network managers including a list of keywords that
they could use to identify and remove encryption software from its software repositories, including
“cipher”, “watchdog”, and “pgp”. Moreover, the company asserted, any employees that were even
aware of a proposed transfer of an item to another country that would violate ITAR would have to
send a notice to the company’s Worldwide Trade Office explaining the details (Conklin, personal
communication, 1994). This impacted adoption of PGP in the corporate sector, even after
Zimmermann negotiated a deal to offer PGP legally to corporate users for $100 through a company
that already had an RSA license, Viacrypt (Bidgoli, 2006).
Though this version of the software was unpopular, the free version spread rapidly online,
particularly in amateur and hobbyist communities that RSA did not reach. PGP became an
exemplar of the maxim coined by John Gilmore: “The Net interprets censorship as damage and
routes around it” (Elmer-DeWitt, 1993). It was available through a site linked to MIT and a forum
on CompuServe warning visitors “IF YOU ARE NOT A CITIZEN OF THE UNITED STATES,
DO NOT DOWNLOAD THIS FILE” (Sussman, 1995). The continued spread of PGP abroad
suggests not all users obeyed this warning. In 1995, Zimmermann published the source code for
PGP in an ingenious, and legal, manner: in a hard-copy book published by MIT Press, which used
machine-readable fonts that could be scanned using optical character recognition software (Bidgoli, 2006).
Figure 3.2: “PGP: Source Code and Internals”. Source: Philip R. Zimmermann, MIT Press, 1995.
In the introduction to the book, Zimmermann wrote:
PGP is free software. Anyone may download it on the Internet, or from many Bulletin
Board Systems. It has stirred up some controversy, because it has become a worldwide de
facto standard for Email encryption, despite US export restrictions. Initially published in
the US, this package has spread by the diffusion that is common to free software packages,
with its "forbidden" flavor giving it an extra popularity kick. Oddly enough, the US
Government may have inadvertently contributed to PGP's spread, by making it more
popular because of my case. (Zimmermann, 1995).
Zimmermann’s struggles over RSA’s patents formally concluded in 1994, when RSA
released a free, noncommercial version of its algorithm called RSAREF. It was designed to
act as a “portable, educational, reference implementation” of cryptography – freeware that could
help individuals implement a new internet standard called Privacy Enhanced Mail (RSA, 1994;
Levy, 1994b). Once this version was made available, Zimmermann replaced the section of
software that violated RSA’s patents with RSAREF, and rereleased it as PGP version 2.5. This
version was still backwards compatible with prior, violating versions of PGP, which Bidzos took
issue with. He compelled Zimmermann to release one more version, PGP version 2.6, which would
no longer decrypt files produced using the previous versions of PGP.
The Zimmermann Case
Zimmermann’s troubles continued for several more years, but arguably only further
solidified the community of advocates focused on the protection of the ability to use encryption.
In 1993, a grand jury in San Jose began to collect evidence to assess whether to indict
Zimmermann for violating export controls (Sussmann, 1995). The patent battles only fueled the
probe, creating additional grounds on which Zimmermann could be investigated. In 1994 he was
detained and interviewed at Washington Dulles Airport by US Customs officials, who were
concerned that PGP was stolen property being shipped overseas because it contained RSA’s
patented software (Garfinkel, 1995; Diffie & Landau, 2000). A coalition of emerging civil liberties
groups organizing around the protection of privacy on the internet joined to provide Zimmermann
legal support to continue to fight back: the Electronic Privacy Information Center, the Electronic
Frontier Foundation, Computer Professionals for Social Responsibility and the American Civil
Liberties Union all assisted Zimmermann’s team of defense lawyers. Internet users also
joined together to write letters to politicians and to contribute financially to Zimmermann’s legal
defense fund (Gimon, 1995).
In 1996, the government adopted a new tactic that it would use in subsequent legal battles:
declining to prosecute the case without providing information as to why, thus eliminating the
oxygen fueling the groups that rallied in opposition to it. As the Assistant US Attorney in the case
put it, “This was an interesting case that has raised a lot of interesting issues from both a legal and
technical point of view. Sometimes the right thing to do is nothing” (Markoff, 1996a).
Zimmermann’s lawyer, Phil Dubois, speculated about the reasons in more detail:
One perfectly good reason might be that Mr. Zimmermann did not break the law… Another
might be that the government did not want to risk a judicial finding that posting
cryptographic software on a site in the U.S., even if it's an Internet site, is not an "export".
There was also the risk that the export-control law would be declared unconstitutional.
Perhaps the government did not want to get into a public argument about some important
policy issues: should it be illegal to export cryptographic software? Should U.S. citizens
have access to technology that permits private communication? And ultimately, do U.S.
citizens have the right to communicate in absolute privacy? (Dubois, personal
communication, 1996).
Bernstein vs. the U.S. Department of Justice
These questions would be confronted directly through another legal case, this time
brought by a graduate student in computer science against the US government. Daniel Bernstein,
then studying at the University of California, Berkeley, sued the government for violating his
first amendment rights through the export controls regime. Bernstein had developed a new
encryption algorithm called Snuffle, which converted one-way hash functions, themselves freely
exportable under the export control system, into encryption systems (Bernstein, 1997). Under the
law, in order to publish this work or to present it at conferences, Bernstein would be required to
submit his research for government review, to register as an arms dealer, and to apply for and
obtain a license before publishing.
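The regulatory distinction at the heart of Snuffle is easier to see in code: a one-way hash function by itself only digests data, yet a few lines of wrapping turn it into a cipher. The Python fragment below is a minimal sketch of that general hash-to-cipher idea, using SHA-256 in a counter arrangement; it is an illustrative toy of my own devising, with hypothetical names like keystream and xor_cipher, not Bernstein’s actual construction, whose details differ.

    import hashlib

    def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
        # Derive a keystream by repeatedly hashing key || nonce || counter.
        # Illustrative only; real stream-cipher designs differ in detail.
        out = bytearray()
        counter = 0
        while len(out) < length:
            out.extend(hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest())
            counter += 1
        return bytes(out[:length])

    def xor_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
        # XORing with the keystream encrypts; applying the same
        # operation a second time decrypts.
        return bytes(b ^ k for b, k in zip(data, keystream(key, nonce, len(data))))

    ciphertext = xor_cipher(b"my key", b"nonce01", b"attack at dawn")
    assert xor_cipher(b"my key", b"nonce01", ciphertext) == b"attack at dawn"

The hash call is the only cryptographic ingredient here: the hash function alone was freely shareable under the export rules, while the thin scaffolding around it made the whole an export-controlled encryption system.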
Bernstein was explicitly interested in challenging the export control system, which he
described as “silly” in his declaration before the court. He was particularly interested in a supposed
“public domain” exception to the law, and first explored whether leaking cryptographic
information to the press would shield someone from prosecution. He found that the exception
was essentially a catch-22: “nothing can be in the public domain unless it has already been
published, and nothing can be legally published unless it is in the public domain” (Bernstein, 1997).
Bernstein was supported by the Electronic Frontier Foundation, which sued the US
government on his behalf, asserting that software code constituted expressive speech and thus
deserved first amendment protections (Dame-Boyle, 2015). The court ruled that the export controls
regime did indeed violate Bernstein’s rights. In the initial ruling, Judge Marilyn Hall Patel made
the groundbreaking assertion that code is a form of speech:
This court can find no meaningful difference between computer language, particularly
high-level languages as defined above, and German or French....Like music and
mathematical equations, computer language is just that, language, and it communicates
information either to a computer or to those who can read it (Bernstein I, 922 F. Supp. 1426
(N.D. Cal. 1996))
A subsequent ruling found that the export rules on encryption acted as an unconstitutional prior
restraint on speech.
Finally, in a third ruling, the court reiterated that the export controls were an
unconstitutional prior restraint on speech despite changes to the regulations shifting the agency
responsible from the Department of State to Department of Commerce. As Judge Patel put it,
By the very terms of the encryption regulations, the most common expressive activities of
scholars — teaching a class, publishing their ideas, speaking at conferences, or writing to
colleagues over the Internet — are subject to a prior restraint by the export controls (Flynn,
1997).
Patel’s ruling asserted that export controls regulated cryptographic software because of its
content, not solely its function (Markoff, 1996b). However, the court limited the implications of
its ruling: though it could have issued a nationwide injunction against the enforcement of
encryption restrictions against anyone, it declined to do so because “the legal questions at issue
are novel, complex and of public importance” and an appeal from the government was likely
(Flynn, 1997).
The government adopted a tactic similar to the one used in the Zimmermann case: the injunction
granted to Bernstein was stayed in a decision by the Ninth Circuit Court of Appeals in 1999, but
then the government asked for en banc review of the case, which would mean a review by eleven
judges instead of three. This meant that the panel decision was withdrawn, but before the en banc
review could occur, the government made changes to its regulations that necessitated sending the
case back to the District Court. As oral arguments began, the government backed away, a move
that prevented a final decision in the case (Bernstein, n.d.). The Crypto Wars – at least, as
conventionally understood – were over, without a culminating battle or decisive victory.
The example of PGP offers a different lens on the development of cryptographic software
in the private sector. While the RSA example shows how regulators can channel software
development around a preferred standard, PGP demonstrates that regulations can backfire: the
legal strategies adopted by government officials fueled interest in the software, which, despite its
poor usability, attracted users in the free and open source software community and those concerned
about protecting privacy. Moreover, the legal battles Zimmermann encountered helped bring
together a coalition of civil liberties groups interested in the protection of
privacy online. Ultimately, PGP became a nexus point for a nascent community of cryptography
enthusiasts interested in sharing and experimenting with cryptographic software.
Conclusion
This chapter has examined how commercial and regulatory influences combined to shape the
use of cryptography in the 1980s and 1990s. Through the examples of RSA and PGP, I’ve shown
two very different possible directions for the incorporation of cryptography into software. In
RSA’s case, encryption became invisible and embedded in software in ways that users were
unlikely to be conscious of. This choice in the implementation of RSA had important downstream
consequences: its invisibility to users obscured the fact that encryption was incorporated into internet
infrastructure in ways that encouraged commercial transactions while masking the absence of privacy
protections. Moreover, RSA’s algorithms became standards for securing software in the 80s and
90s, the result of preferences by other software firms and government regulators’ approval.
This meant that RSA algorithms began to take on properties of infrastructure:
embeddedness, transparency, invisibility, standardization and links with conventions of practice
(Star, 1999). Though I provided a more substantial review of the literature on infrastructure in the
introduction, it may be worth reinforcing here that infrastructure is, at its core, relational; it
embodies hard technical materiality and is shaped through social processes (Star, 1999; Star &
Ruhleder, 1999; Edwards, 2003; Leonardi, 2012). In this chapter, I’ve focused on how encryption
technologies are entangled with a particular subset of social processes: how the author’s intentions,
the market and the law shaped how RSA’s cryptographic software developed and became
commercialized.
PGP was similarly influenced by the market and the law, but with a dramatically different
outcome. Where RSA software was largely invisible to users, PGP was markedly visible, its use
of encryption constantly foregrounded to users both in its marketing and the actual affordances of
the technology. Despite this, a community of users was drawn to the shared challenge of
navigating PGP’s complex code in order to take advantage of the software’s privacy protections.
The efforts of regulators to curb PGP’s spread only further motivated the formation of a
community of amateur cryptographers, granting the software a ‘renegade’ reputation that may have
in fact encouraged a certain user base to test it out. Moreover, the legal battles faced by
Zimmermann and courted by Bernstein contributed to the establishment of a coalition of privacy
organizations united in support of the protection of a right to privacy and the use of encrypted
software online.
As I will describe in the next chapter, as an open source alternative centered on the
protection of individual privacy, PGP went on to develop greater importance for cryptography
advocates. Zimmermann’s explicit embrace of the politics of his software, and the story of his
eventual triumph in these legal battles, became a subject of legend within the nascent cryptographic
community. PGP users began to band together around a notion of freedom that could be achieved
through cryptography, formulating a new social imaginary akin to the kind of civil disobedience
modeled by Zimmermann. Drawing on PGP’s ability to overcome export control measures
designed to reinforce national borders, these new cryptophiles ultimately generated an elaborate
political philosophy, crypto-anarchy, premised on the notion that cryptography itself could level
the playing field, strengthening the power of individuals to overcome the sovereignty of states.
Though I will elaborate on this social imaginary in greater detail in the next chapter, I’d
first like to trace through how these accounts illustrate distinct perspectives on the dominance of
law or technology. The rapid spread of Zimmermann’s program illustrates in practice the values
of open access at work in the architecture of the internet, enabling the software to circulate in
contravention of legal regimes. And yet, despite the assertions of individuals like John Gilmore
that ‘the Net interprets censorship as damage and routes around it’, in other respects the case of
PGP suggests that law still can be a dominant force.
In this case, its effect was not produced through clear legal decisions, but through strategic
ambiguity: at several points, regulators opted not to pursue cases, instead leaving open the
possibility of future regulation. This had its intended effect: the uncertain legal status of PGP
prevented its use by corporations, and diminished its legitimacy in the eyes of potential users, even
as it gave it a kind of renegade cachet for others. Moreover, though PGP software was ultimately
able to circulate free of legal constraints, Zimmermann himself did face very real personal and
financial challenges as a result of the lawsuits, the downstream effects of which on Zimmermann
and other potential cryptography entrepreneurs should not be discounted. As a result, though the
Crypto Wars are commonly understood as a victory for privacy advocates, I see the question of
which would prove the stronger force – technology or the law – as ultimately unresolved at the
conclusion of the Crypto Wars. This became a primary focus of the community of crypto-
enthusiasts that emerged during these years.
Chapter 4: Cypherpunks, Crypto-Anarchy, and the Interpretative Flexibility of Encryption
The advent of public key encryption sparked the imaginations of amateur technologists and
hackers in the 1980s and 1990s. By the early 1990s, a group of technologists coalesced around
building technologies that would realize the potential of encryption in the networked age. This group,
who called themselves the cypherpunks, proved as influential in reshaping political ideas around
encryption as they were in promoting its use; arguably their ideas carried much farther than any of
the technologies they built.
In a marked departure from many of their peers in the free software community (Coleman,
2010), the cypherpunks were overtly cognizant of how technologies like encryption can have
politics (Winner, 1980). Many, though not all, were technological utopians or, at the least,
technocrats. They believed that technologies themselves could reconstruct the distribution of
power and authority in society; indeed most of them believed they were participating in a project
that would bring forth major social and political change. Though my analysis in this chapter
focuses on the content of cypherpunk ideas rather than on cypherpunk code, their ideas did
manifest in technical choices, with or without the intention of their creators, in ways that are tied
to sociopolitical consequences (Winner, 1991). Thus, explicating and teasing out the nuances of
the philosophies of technologists is an initial step toward the end goal of making visible the
configurations of power and authority in the projects they create.
The nature of this political order proved important for reshaping the political meaning of
encryption after the invention of public key cryptography. If in the 1970s, socially minded
engineers expressed concerns that computers would strengthen bureaucratic control and harm
individual privacy, in the 1980s and 90s, many members of the computing community came to the
exact opposite conclusion: that computers and networking technologies could be a means to
achieving freedom and personal fulfillment (Turner, 2006; Streeter, 2011; Schulte, 2013).
Through discussions, debates, thought experiments and code production, the cypherpunks
constructed a new social imaginary for encryption in a networked age. This social imaginary,
which they labeled crypto-anarchy, posited that new kinds of social relations, enabled by
cryptography, have the potential to even out power discrepancies between states and individuals,
bringing into being a freer, more meritocratic, world. Moreover, crypto-anarchist thought
hypothesizes that bringing cryptographic technologies into the world would, in and of itself,
catalyze these changes in the face of whatever legal or regulatory challenges states sought to
impose. For cypherpunks, cryptography is fundamentally about sovereignty, achieved by
individuals through technological means.
This chapter proceeds as follows: first, it interrogates the nature of that political order
through an examination of the core texts that have come to form the corpus of cypherpunk ideology,
a series of manifestos written by the group’s co-founders, Timothy C. May and Eric Hughes.
Through a close reading of the texts, I argue that there were in fact at least two distinct, not entirely
compatible interpretations of crypto-anarchy among the founders. In the second part, I trace these
interpretations through the discussions and debates within the cypherpunk collective, drawing on
the Cypherpunk Archive, a collection of the emails sent between members of the cypherpunk
listserv. I consider the range of meanings and interpretations put forward by members of the
cypherpunk community in its early years through the framework of interpretative flexibility, or
how different meanings come to be associated with technological artifacts (Pinch & Bijker, 1987;
Kline & Pinch, 1996).
Kline & Pinch describe interpretative flexibility as a means of countering an over-emphasis
in STS literature on studying the processes of technical production by considering how other
relevant social groups can reshape the meaning of technical artifacts. The cypherpunks are a unique
social group engaged with cryptography, which may best be understood as playing a mediating
role – alternately acting as users, producers, and advocates for cryptography. The cypherpunks
fused technical production with their advocacy, rather than silo these off as separate domains: their
aims are much like what Christina Dunbar-Hester describes as propagation, or the combination of
interpretive work and material engagement with an artifact (Dunbar-Hester, 2014). Despite this, I
argue that it is this interpretive work that contributed most to the lasting social significance of the
cypherpunks, outweighing the group’s claims that “cypherpunks write code”.
Cryptographic Imaginaries
In previous chapters, I have focused on encryption as a discursive artifact, tracing how
ideas about what encryption is for evolved alongside networked infrastructures. I have argued that,
prior to the 1960s, encryption was largely understood as a tool of state power and a critical national
security resource. At the end of World War II, US government officials continued stringent
classification regimes designed to perpetuate the state’s monopoly on cryptography, considering
it an important element in the nation’s security arsenal for the Cold War. With the advent of
networked communications, however, encryption began to take on new meanings. As commercial
businesses began to adopt computer technologies, encryption grew to be understood as an
important aspect of data security for emerging computing businesses. This meaning widened to
encompass privacy as well, partly in reaction to a series of government wiretapping scandals, and
partly due to fears by technologists that new networked and database computing technologies
could be abused.
In the 1980s and 90s, cryptography began to take on properties of infrastructure, becoming
embedded in software in ways that made it largely invisible to end-users. At the same time, a new
community of amateur cryptographers arose and began to distribute cryptographic software within
the free and open source community. This led to a series of regulatory battles in the 1990s that
established, at least for the time being, that the writing of cryptographic software was a form of
expressive speech protected by the first amendment.
In this chapter, I adopt a new framework for understanding and analyzing encryption,
shifting from a study of encryption as discourse to encryption as a social imaginary. I understand
a social imaginary to be something broader and more all-encompassing than discourse; it is, as
Charles Taylor describes it, “not a set of ideas; rather it is what enables, through making sense of,
the practices of a society” (2002, 91). Social imaginaries bridge ideas and practices, they
encompass both ways of thinking and ways of being in the world. This is a particularly powerful
concept for understanding the ideas that we elaborate around technologies, because it affords a
mode of analysis that can include both technical practice and discursive arguments (Kelty, 2005).
The cypherpunks embody what Chris Kelty has described as a recursive public, “a public
that is vitally concerned with the material and practical maintenance and modification of the
technical, legal, practical and conceptual means of its own existence as a public” (Kelty, 2008, 3).
Cypherpunks did not only philosophize about the potential usages of cryptography, cypherpunks
“wrote code” in order to bring a new political order into being. Though this chapter focuses more
on the nature of the political order they envisioned, I understand this as something more than a
“crypto-discourse” (Hellegren, 2017); it is a cryptographic imaginary that is embodied in the
technologies produced within the cypherpunk community and in their modes of community
governance, the tools they developed to commune and relate to one another and to create change
in the wider world.
The Cryptographic Underground
The cypherpunks coalesced as a group in the early 1990s, just as the Internet was
undergoing a process of commercialization that greatly expanded its use among non-experts. Many,
if not most, of the cypherpunks were computing specialists: though some who joined the group
were amateurs, most of them were software engineers who were enrolled in, graduated from, or
worked at the computer science departments of major universities, were employed by large
technology companies, or participated actively in free software projects. Some participants were
notorious hackers, while others were key figures in early online communities such as FidoNet or
in the phone phreaking scene. The cypherpunk community emerged at the nexus of several distinctive
computing communities of the 1980s and 90s, exhibiting the strains of communitarian and
libertarian cyberculture endemic to the period with a particularly political bent. What brought them
together was a shared interest in realizing the potentials of cryptography for a networked age.
Many of the ideas that inspired the cypherpunks in fact came out of the work of academic
cryptographers in the 1980s (Narayanan, 2013; Narayanan, 2017; Hellegren, 2017). As Rivest,
Shamir and Adleman worked to develop and commercialize implementations of public key
cryptography on the east coast, on the west coast researchers like David Chaum sought to formulate
new ideas for what cryptography could be used for. Over the course of the 1980s, Chaum published
influential articles outlining thought experiments for potential uses of cryptography: digital cash,
which could be used to transact online securely and anonymously (Chaum, 1982), blind signatures,
which would allow a party to sign, and thereby validate, a message without seeing its content
(Chaum, 1982), and mix networks, which would allow groups of individuals to send encrypted
messages to a server, where they would be scrambled and forwarded on to their destinations
without a trace of the original sender (Chaum, 1988). Chaum’s work moved beyond the concerns
of privacy outlined by Diffie, Hellman, Rivest, Shamir and Adleman, instead foregrounding the
potential for encryption to be used in ways that could ensure anonymity.
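The core of Chaum’s mix idea can be conveyed in a short sketch. The toy Python below is for illustration only: it uses symmetric keys from the widely available cryptography package as a stand-in for the public-key encryption a real mix network employs, and the wrap and node_process functions are hypothetical names of my own. A sender nests one layer of encryption per node on the route, so that each node can strip exactly one layer and learn nothing beyond the next hop.

    from cryptography.fernet import Fernet

    # One key per mix node. A real mix would use each node's
    # *public* key, so senders share no secrets with the nodes.
    node_keys = [Fernet.generate_key() for _ in range(3)]

    def wrap(message: bytes, route: list) -> bytes:
        # Apply the innermost layer first, so the first node on the
        # route peels off the outermost layer.
        for key in reversed(route):
            message = Fernet(key).encrypt(message)
        return message

    def node_process(key: bytes, envelope: bytes) -> bytes:
        # Each node removes exactly one layer; intermediate nodes see
        # neither the plaintext nor the original sender.
        return Fernet(key).decrypt(envelope)

    envelope = wrap(b"meet at noon", node_keys)
    for key in node_keys:  # the message travels node by node
        envelope = node_process(key, envelope)
    # envelope == b"meet at noon" at the final hop

In Chaum’s design the mix also batches and reorders the messages passing through it, which is what frustrates traffic analysis; the nested encryption alone hides content and routing information, but not timing patterns.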
These ideas were exciting to a nascent group of cryptophiles who hung out around the Bay
Area. At Stewart Brand’s Hacker Conference and other, similar gatherings, May, Eric Hughes,
John Gilmore, Arthur Abraham, Hal Finney, Perry Metzger and others began to discuss how these
ideas could be translated into real-world programs. As May describes it, “The time was
right in 1992 to deploy some of these new ideas around in the cryptography and computer
communities and reify some of these abstractions” (May, in Vinge, 2001, 37). He and co-founder
Eric Hughes gathered the group – about 15 people in all – to meet in person at Hughes’
Berkeley home. The group first met in mid-September of 1992, and swiftly built an online presence
through an e-mail listserv set up days after the initial meeting.
The listserv played an important role early on in establishing the cypherpunks not only as
a Bay Area community, but a network that rapidly expanded to involve participants from across
the US and internationally. As May put it:
The cypherpunks group is…a good example of a “virtual community”. Scattered around
the world, communicating electronically in matters of minutes, and seemingly oblivious of
local laws, the Cypherpunks group is indeed a community; a virtual one, with its own rules
and its own norms for behavior (May, in Vinge, 2001, 37).
The rules and norms he refers to eventually became a notorious aspect of the group, known for its
freewheeling and often acrimonious debates.
The Cypherpunks Go Public
Despite their initially small number, the cypherpunks drew public attention quickly in their
early days, featuring in media coverage in WIRED magazine, Mondo 2000, and The Village Voice.
Within a few months of the group’s first meeting, May, Hughes and John Gilmore appeared on the
cover of WIRED in an article titled “Rebels with a Cause (Your Privacy)”, wearing leather
outfits, holding an American flag and sporting white masks to hide their faces. In vivid prose,
Hackers author Steven Levy described the work of the group by saying “It's the FBIs, NSAs, and
Equifaxes of the world versus a swelling movement of Cypherpunks, civil libertarians, and
millionaire hackers. At stake: Whether privacy will exist in the 21st century” (Levy, 1993).
Figure 4.1: WIRED “Crypto Rebels” cover, February 1, 1993. Source: WIRED Magazine.
Levy described the work of the group as largely political, asserting that though a “world where the
tools of prying are transformed into the instruments of privacy” was technically possible, there
were legal obstacles to overcome: “some of the most powerful forces in government are devoted
to the control of these tools…there is a war going on between those who would liberate crypto and
those who would suppress it” (Levy, 1993).
Levy’s article was followed, a few months later, with another piece by WIRED Executive
Editor Kevin Kelly, this time in the Whole Earth Review, describing cypherpunk meetings and
listserv activities in greater detail:
On a couple of Saturdays, I join Tim May and about fifteen other cryptorebels for their
monthly cypherpunk meeting, held near Palo Alto, CA. The group meets in a typically
nondescript office complex full of small hitech startup companies. It could be anywhere in
Silicon Valley. The room has corporate gray carpeting and a conference table. The
moderator for this meeting, Eric Hughes, tries to quiet the cacophony of loud, opinionated
voices. Hughes, with sandy hair halfway down his back, grabs a marker and scribbles the
agenda on a whiteboard. The items he writes down echo Tim May's digital card: reputations,
PGP encryption, anonymous remailer update, and the Diffey-Hellmann [sic] key exchange
paper.
[…]
The main thrust of the group's efforts takes place in the virtual online space of the
Cypherpunk electronic mailing list. A growing crowd of cryptohip folks from around the
world interact daily via an Internet "mailing list." Here they pass around code in progress
as they attempt to implement ideas on the cheap (such as digital signatures), or discuss the
ethical and political implications of what they are doing. (Kelly, 1993).
These articles helped not only to publicize the group’s activities, but to brand them as a
group of “crypto-rebels”, appealing to a certain counter-cultural activist aesthetic. They also
helped popularize ideas about the potential value of encryption, online privacy and anonymity to
a networked society – adopting a view that Larry Lessig would distill years later into the maxim
“code is law” – that technology could regulate, could embed the values that its creators see as
fundamental in technical architecture, and that this architecture could delimit or even overcome
attempts by regulators to control us (Lessig, 1999).
For the cypherpunks, the matters of greatest concern were tied to a unique set of problems
posed by networked connectivity. As Kelly put it, “In a world where everything is connected to
everything, where connection and information and knowledge are dirt cheap, then disconnection
and anti-information and no-knowledge become expensive” (1993). The cypherpunks feared that
a society in which disconnection was too expensive would lead to the formation of a surveillance
state in which individual freedom would be substantially curtailed.
In these early days, the cypherpunks emphasized the need to build projects that practically
applied the ideas developed within cryptography circles: as co-founder Eric Hughes put it,
“cypherpunks write code” as a primary means of combatting government regulation. They built
software that helped popularize PGP by making it compatible with different operating systems,
developed anonymous remailers, using Chaum’s ideas about mix networks to make it possible to
send messages without any trace of the identity of the sender, and tested out systems for using
digital cash. Some of the technologies they built proved popular: for example, the anonymous
remailer anon.penet.fi went into widespread use as a way to post anonymous messages to Usenet
forums, at its peak handling 4,000 messages per day. Others only reached fruition years later:
Wikileaks, Bitcoin, and the Silk Road are all legacy projects that derive from proposals first
circulated on the cypherpunk listserv.
However, the longstanding legacy of the cypherpunk community proved to be the potency
of their ideas: the magazine articles, circulation of manifestos, and engagement in political
opposition to regulatory battles did more to spread the movement than the group’s experiments
with anarchy or code. As I describe in the next chapter, the tenor of cypherpunk ideas, rather than
the technology of cypherpunk code, had a lasting effect in the memory of the cryptographic
community.
Cypherpunk Politics
As a collective, the cypherpunks are most notable among the variant strains of hacker
communities (Jordan, 2016) for their overt embrace of the political dimensions of technology.
While some hackers abstain from political debates unless linked directly to their productive
freedom (Coleman, 2013), political ideals are core to the cypherpunk community and have been
since its formation. Thus, while the cypherpunks, like other hobbyist groups, spent a lot of time
sharing code and arguing over the technical details of the cryptographic tools they built, much of
their effort went to envisioning institutional and political goals for encryption
technologies in an increasingly networked world.
The founders of the cypherpunk listserv, Timothy C. May and Eric Hughes, were key
proponents of these discussions. May and Hughes envisioned encryption as a link to a new political
future realized through technology: one in which cryptography provides for individual
empowerment and protection against the surveillance state. May coined the term crypto-anarchy
to capture this political philosophy, which he describes as “the cyberspatial realization of
anarcho-capitalism, transcending national boundaries and freeing individuals to consensually
make the economic arrangements they wish to make” (May, in Vinge, 2001). By evoking the term
‘cyberspace’, May is suggesting an ideology that exists on a new frontier, one in which individuals
are freed from the bounds of physical form and state apparatus. Like his contemporary John Perry
Barlow, who popularized the term, May is building on the evocative potential of networked
technologies to offer us “a taste of rebel-hero selfhood”, a new and romantic utopia (Streeter, 2011,
123). But where Barlow’s cyberspace is more philosophical than it is practical, May’s formulation
of crypto-anarchy is markedly determinist: the political and economic goals of limiting state power
and bolstering the individual are realized through coding projects and thought experiments
designed to bring these new ways of being into existence.
Crypto-anarchy espouses a view that encryption can fundamentally disrupt sovereignty,
destabilizing power relations within the nation-state system to create a new world in which
individuals and communities may be empowered to counteract the institutional apparatus of the
state. This idea – that technology can disrupt politics – is not entirely original; it builds on
libertarian antecedents that were widespread in the Bay Area computer counterculture during the
late 1980s and early 1990s. Richard Barbrook and Andy Cameron described the “Californian
Ideology” held by “writers, hackers, capitalists and artists” in the area as follows:
Information technologies, so the argument goes, empower the individual, enhance personal
freedom, and radically reduce the power of the nation-state. Existing social, political, and
legal power structures will wither away to be replaced by unfettered interactions between
autonomous individuals and their software. (Barbrook & Cameron, 1995, 5).
But the cypherpunk variant adds an air of inevitability to the notion, grounding it in claims that the
principles of mathematics themselves ensure that the technology of encryption will inevitably
disrupt institutional power.
As Julian Assange put it in his book Cypherpunks: Freedom and the Future of the Internet in
a section titled “A Call to Cryptographic Arms”,
the universe, our physical universe, has that property that makes it possible for an
individual or a group of individuals to reliably, automatically, even without knowing,
encipher something, so that all the resources and all the political will of the strongest
superpower on earth may not decipher it. And the paths of encipherment between people
can mesh together to create regions free from the coercive force of the outer state. Free
from mass interception. Free from state control. (Assange, 2012, 5).
These discursive formulations are misleading when read on their own, however, because they elide
the actual process of achieving change. In practice, both Assange and May espouse a vision of
crypto-anarchy that I have described elsewhere as a kind of “survival of the cryptic”: they position
technical savvy as a means of gaining power and authority.32 From this perspective, learning to
use encrypted technologies in sophisticated ways enables people to engage in anonymous
transactions that rebalance power relations away from institutions and toward individuals. Each of
us is on our own to make of ciphers what we may.

32
The notion that the technologically ignorant are deserving of deception at the hands of those with
greater skill is a discourse with a long precedent; for example, Carolyn Marvin (1988) writes about how
electrical engineers at the end of the 19th century used technical language to demarcate and bound their
technical authority. Similar ideas circulate within some circles of hackers as a justification for exploiting
the weaknesses of others – a practice that, as I will show, some cypherpunks overtly sought to distinguish
themselves from.
Claims like “privacy for the weak, transparency for the powerful” are thus situated in
atomized, libertarian terms rather than broadly democratic ones. This focus on the individual is at
odds with the kinds of mutualistic relations positioned in association with public key cryptography
in the 1970s. It overlooks the fact that using encryption for both communications and transactions
requires someone on the other end to exchange messages with. However, as I will illustrate, not all
cypherpunks espoused this vision, nor did all those who believed in crypto-anarchy understand it
the same way.
Cypherpunk Manifestos33

33
Portions of this section were previously published in West, S. (2017). Survival of the Cryptic. Limn, 8.
https://limn.it/survival-of-the-cryptic/
A set of core texts written by May and Hughes and shared on the listserv provides an entry
point to parsing the distinct variants of crypto-anarchy within the group. Over the course of several
manifestos, May and Hughes outline two possible political futures for the group: a determinist,
libertarian strain, and a mutualist, recursive strain. Analyzing these manifestos provides an
indication of the breadth of cypherpunk politics and conflicting views about the politics of
cryptography as it developed in the early years of the community.
The first and most well-known of these texts is the Crypto Anarchist Manifesto, which was
published to the list in 1992 but circulated previously at Brand’s Hacker Conferences and at the
cypherpunks’ first in-person meeting. In the Manifesto May remarked,
Computer technology is on the verge of providing the ability for individuals and groups to
interact with each other in a totally anonymous manner. Two persons may exchange
messages, conduct business, and negotiate electronic contracts without ever knowing the
True Name, or legal identity, of the other…. These developments will alter completely the
nature of government regulation, the ability to tax and control economic interactions, the
ability to keep information secret, and will even alter the nature of trust and reputation”
(May, 1992).
May envisaged the development of a trade in national secrets that would make it possible for
whistleblowers to uncover corruption in government without risking harm to their physical selves.
He was inspired by texts like the 1985 science fiction novel Ender’s Game by Orson Scott Card.
In the book, two children, a girl and a boy, post political essays anonymously to a global
communication system under the pseudonyms Demosthenes and Locke, winning over policy
experts and ascending to the world stage despite their youth. Anonymity enabled them to overcome
the disparities in power and reputation accorded to their age: it leveled the playing field such that
arguments were judged based on the content of their information rather than by the reputation of
the speaker. May’s vision builds upon this by seeking to establish a market in information
separated from its institutional context. In so doing, May thought anonymous leaks could check
the power of institutions like governments and corporations, redistributing it back to individuals.
Though Card’s vision is very nearly an embodiment of Habermasian discourse, May’s
interpretation is more akin to a capitalist marketplace of ideas than a rationalized public sphere.
“Combined with emerging information markets, crypto anarchy will create a liquid market for any
and all material which can be put into words and pictures,” he said, leaving it up to the invisible
hand of the market to define the value of that material (May, 1992).
He goes on to describe the political changes on the frontiers of an anarchic future: “just as
the technology of printing altered and reduced the power of medieval guilds and the social power
structure, so too will cryptologic methods fundamentally alter the nature of corporations and of
government interference in economic transactions” (May, in Ludlow, 2001). May’s vision is
perpetually grounded in a strategy for change focused on economic transactions between
individuals outside the purview of the state.
May offers up depictions of the crypto-anarchic future throughout the manifesto: a world
in which “two persons may exchange messages, conduct business, and negotiate electronic
contracts without ever knowing the True Name, or legal identity, of the other”. Anonymity would
make reputations of particular importance, thus disintermediating the role of the state and reducing
its control over the individual. The Manifesto acknowledges the likelihood of dystopian
consequences of these changes, such as the spread of national secrets, the sale of illicit goods and the use
of cryptography by criminals. But while describing this future as inevitable, it welcomes the
possibilities for increasing individual sovereignty.
Eric Hughes provides another vision for the group in the Cypherpunk’s Manifesto, which
he shared with the group over the listserv in October 1992. The first half of the Cypherpunk’s
Manifesto reads more like a treatise on the meaning of privacy for the electronic age, shifting
registers to outline a statement of purpose for the group only in its last few paragraphs:
Cypherpunks assume privacy is a good thing and wish there were more of it. Cypherpunks
acknowledge that those who want privacy must create it for themselves and not expect
governments, corporations, or other large, faceless organizations to grant them privacy out
of beneficence (Cypherpunk Mailing List, Oct. 5, 1992).
He shifts emphasis away from the political possibilities of crypto-anarchy and toward the
need to protect privacy for individuals because of the proliferation of electronic communications.
In contrast to May, Hughes’ vision for the group is grounded in a strategy for change that relies on
mutual social relations: “For privacy to be widespread it must be part of a social contract…Privacy
only extends so far as the cooperation of one’s fellows in society”. This directly contradicts May’s
arguments about “the responsibility of secret-holders” – instead, Hughes stakes his claim on the
notion that cypherpunk politics would of necessity be a cooperative effort.
He emphasizes the practical work of the group in addition to a set of political goals:
“Cypherpunks write code. They know that someone has to write code to defend privacy, and since
it’s their privacy, their [sic] going to write it”, he says. In so doing, Hughes is asserting that the
group must do more than just wax lyrical on the listserv; cypherpunks should work collectively to
build the tools needed to make networks safe for privacy.
The alternatives Hughes seeks to produce are premised on a vision of privacy unlike the
market-driven, transactional relations outlined by May. For Hughes, the ultimate goal of the
cypherpunks’ work is to make privacy – defined here as “the power to selectively reveal oneself
to the world” – possible through anonymity. As such, Hughes’ vision for privacy is more akin to
what Helen Nissenbaum (2010) has described as a “contextual” approach to privacy – neither a
right to secrecy nor to control, but to the appropriate flow of personal information. Prefiguring the
kinds of concerns that arose alongside social networks, Hughes treats privacy as the
management of one’s visibility.
Between these two visions lies a range of possible interpretations of the political meaning
of cryptography, and each leads to different types of problems for cypherpunks to work on. For
Hughes, cryptography’s purpose is primarily as a means of secret communication. Thus, the
cypherpunks’ contribution would be to build on the invention of public key encryption in order to
democratize its availability and to make technologies that would enable average individuals to
communicate with one another in secret.
By contrast, May’s statement suggests he sees cryptography as being primarily about
sovereignty: it is a mechanism for addressing power discrepancies between individuals and institutions,
rebalancing them by bolstering the capacity of individuals to protect themselves against
the state. Though May’s vision implicitly evokes the need for privacy, it exceeds it. Instead, his
focus is on cryptographic projects that make new kinds of transactions possible, foregrounding the
individual as homo economicus.
Neither interpretation reduces easily to a singular political philosophy, but the distinction
between libertarianism and communitarianism may be helpful in pointing toward a second area of
differences between the two. May’s crypto-anarchy is explicitly a variant of libertarianism, a point
he reinforces in the Cyphernomicon, the FAQ about the cypherpunks compiled by May in 1994.
In the Cyphernomicon, he says libertarianism “is undeniably the most represented ideology on the
list, as it is in so much of the net….If other major ideologies exists [sic], they are fairly hidden on
the Cypherpunks list”. Under the section “The Nature of Crypto Anarchy”, May writes:
+ Motivation
- Vernor Vinge's "True Names"
- Ayn Rand was one of the prime motivators of crypto
anarchy. What she wanted to do with material technology
(mirrors over Galt's Gulch) is _much_ more easily done
with mathematical technology.
Inspired by Rand, his crypto-anarchy aims to circumvent the political process by using technology,
avoiding the need for institutional change by enabling individuals to create change themselves.
Hughes’ statement suggests his approach to cypherpunk politics similarly emphasizes
individual empowerment, but recognizes that achieving the goals set out by the group would only
be possible through collective work. He expresses the need for cooperation, a “social contract”
through which people use anonymous systems “for the common good”. As such, his expression
comes closer to the communitarian vision of a connected future, one in which small collective
groups are empowered to govern themselves. Importantly, Hughes’ collectivism is grounded in a
technological imaginary that Lana Swartz describes as infrastructural mutualism: “to ‘write code’
was to produce not just technology but to produce the protocols through which privacy, and
therefore communicative reality, would be executed” (Swartz, 2018, 4).
There existed an even broader diversity of views on the political future for cryptography
among cypherpunks on the list. The Cyphernomicon emphasizes throughout the text that the group
had divergent views about the politics of encryption tools. In these early days of the World Wide
Web, it was already clear to the cypherpunks that the use of encryption would be important, but
they were often sharply divided over their role in promoting its use. This conflict meant that the
tensions outlined in the manifestos – between libertarian and communitarian views, between the
cypherpunks as crypto-anarchists and as mainstream privacy advocates – persisted throughout the
group, ultimately resulting in what May describes as:
an emergent, coherent set of beliefs which most list members seem to hold:
* that the government should not be able to snoop into our affairs
* that protection of conversations and exchanges is a basic right
* that these rights may need to be secured through
_technology_ rather than through law
* that the power of technology often creates new political
realities (hence the list mantra: "Cypherpunks write code")
Interpretative Flexibility and Crypto-Anarchy
An analysis of the first several months on the Cypherpunk list can provide insight into these
wide-ranging political debates. As I will show, an ideological conflict manifested early on in a
division over how the group should publicly represent itself, and whether it should focus on
asserting the values of crypto-anarchy or more mainstream ideas about privacy. The debates I
describe in this section, which spanned the fall of 1992, represent only one episode of many within the
group, which became notorious for its heated political discussions.
In this section, I analyze discussions from the Cypherpunk Archives, a collection of the
emails sent between members of the cypherpunk listserv. Due to the very high volume of emails,
I analyze only the emails from the first three months of the listserv’s existence, dating from September
21, 1992 to December 22, 1992. I was somewhat surprised to find that there was no introductory
message sent out to the group when the listserv was formed; it picks up almost as though its
participants were in the middle of a conversation.
This leaves open the possibility that the archive is missing some early messages; however,
the account of the formation of the group in Steven Levy’s book Crypto seems to suggest there
may not be many; Levy gives an account of the first meeting held in Berkeley, California on
Saturday, September 19, 1992, two days before the first email in the archive. Roughly twenty
cryptophiles, whom Timothy May later described as “some of the brightest folks we knew from the
annual Hackers conference and from the Bay Area computer community” (May, in Vinge, 2001),
met at Eric Hughes’ house to discuss encryption and play a crypto-anarchy game. Shortly
afterward, Hughes set up a listserv, which over 100 people signed up to within a few weeks (Levy,
2001). As of November 26, 1992, the list had 168 individual subscribers and six gateways to other
local redistribution lists; these are unspecified but, based on conversations in the emails, likely
include the Extropians list, which was designed for people who believe in transhumanism and the
possibility of immortality through artificial intelligence and robotics, and FidoNet, a network of
bulletin board systems popular among Internet users in the early 1990s. Though the archive does
not provide any definitive insight into the make-up of the group, the discussions on the listserv
during this period include only one participant who identifies as female, Jude Milhon (“St. Jude”),
with the remainder of the participants being men. All of the most prominent cypherpunks I could
identify (people like May, Hughes, and John Gilmore) were white men in their twenties to forties,
many of whom worked for technology companies, suggesting a relatively homogenous group.
Given the high volume of text on the listserv, I used MAXQDA, a qualitative analysis
software tool, to help me analyze the emails. I used in vivo coding in order to tag items of interest,
including different forms of encryption technologies, expressions of different types of governance
values, as well as the contributions of particularly active cypherpunks, among other things. In
addition, I created memos to collect memorable quotes. I then used thematic analysis to explore
the issues that emerged from the discussion, focusing primarily on the social and political
imaginaries of the community around the use of encryption. In order to better understand this, I
emphasize moments of debate through which these imaginaries either become crystallized or
contested. In addition, I paid close attention to where members of the listserv advocated for the
allocation of resources, whether human or otherwise – what projects were considered important,
and which were unimportant? How did the community organize itself around key projects? A third
aspect of my analysis was a focus on the social relationships between members of the cypherpunk
community. Who has power and how is it wielded? How are new members brought into the fold?
In this, I aim to better understand how social dynamics influenced what was deemed to be
important or unimportant by other cypherpunks.
In making these choices I have inherently excluded other possible themes. One in particular
worth flagging is the exclusion of emails where members of the listserv parse code or technical
scenarios for others to consider. My choice in excluding them is in no small part motivated by their
technical complexity. However, my aim in this chapter is to foreground the political values behind these
technical artifacts – thus making an initial foray into better understanding the influence of this
community on encryption today.
Listserv Practices
The divide among the cypherpunks initially emerged out of fears expressed by some
participants over proper listserv activities, particularly whether it should be used to share classified
materials. In the early days of the list, a cypherpunk going by the name Vanguard posted classified
information shared by a friend in the Air Force about military cryptography devices. The sharing
of this information on the listserv, which was at that time public but not well-advertised, raised
concerns among many participants about the legality of what they were up to. Paranoia was already
in the air for many on the list, following a series of raids on hackers by the FBI and Secret Service
in the early 1990s. While the cypherpunks weren’t breaking into any systems (or at least not under
the auspices of the group), the legality of developing encryption tools without the cover of
intelligence agencies remained under dispute – and some cypherpunks feared discussions on the
list could potentially be used as evidence against them.
A cypherpunk named George Gleason said that though the information was of theoretical
interest, “posting it here really gets me majorly uncomfortable…Posting that kind of thing, even
in a general way, could be very dangerous” (Cypherpunk Mailing List, Nov. 27, 1992). John
Draper, a well-known phone phreaker who had already gone to jail twice for his exploits under the
moniker Captain Crunch, echoed Gleason’s concerns, saying “As an ex-air force person, they can
get really nasty if they want” (Cypherpunk Mailing List, Nov. 27, 1992).
What makes these concerns particularly interesting is that, more so than other hacker
collectives, the cypherpunks were unusually primed to think about confidentiality. The group was
active in the development of anonymous remailers, a tool to enable the sending of encrypted email
with no possible way to identify the sender. Despite this, few cypherpunks availed themselves of
anonymity on the list – most identified themselves with what appear to be their real names and
email addresses. And although the cypherpunks were actively creating technologies that should
ostensibly reduce fears of incursion by authorities, the group nevertheless exhibited trepidation
about the possibility of getting in trouble with law enforcement – fears likely reinforced by their
use of their real identities to discuss matters of unclear legal status.
The cypherpunks were not only concerned about the legality of listserv discussions, but
also the public perception of encryption technologies themselves. The group met regularly in
person to play a crypto-anarchy game devised to simulate the kinds of scenarios that would play
out in their code. Following the second meeting, a cypherpunk named Ravi Pandya sent a note out
to the list about the subject of the game:
Talking to a couple of people who had been to Saturday’s meeting, I was deeply disturbed
by the choice of subject for the simulated privacy game – trying to purchase illegal
drugs….Let me first state that I do favor the legalization of drugs, and I am a committed
libertarian (as I suspect are many of the people in this group). However, I think it is unwise
to tie cryptography in with this issue. Communications privacy is too important to take
the risk that it will get caught up in the current hysteria of the New Prohibition (Emphasis
in original, Cypherpunk Mailing List, Oct. 12, 1992).
Pandya advocated that the cypherpunks evolve their interpretation of their political role
beyond a simple technocratic view, and consider the way they presented themselves publicly – to
treat it as a marketing campaign in addition to a development project.
I think that if we are committed to getting cryptographic technology in wide use, and
spreading true privacy and information security throughout the world, then we need to take
careful account of the political and social factors, as well as the technical. (Cypherpunk
Mailing List, Oct. 12, 1992).
His view was echoed by several other cypherpunks, who said they did not want to alienate people
by choosing controversial topics, including Hughes himself (who was responsible for picking
drugs as the subject of the game).
When the Secret Service raided a Washington, DC meeting of readers of the hacker
magazine 2600 in November, the cypherpunks’ concerns about public perceptions reached a new
high – some of the cypherpunks were convinced they would be targeted next. Bob Stratton lamented
media representations of the raid, saying:
today’s article about the event was yet another example of the media promoting hysteria
about technologically competent people…It makes me sick, and I think that the crypto
community (the part NOT employed by DoD) is next for this sort of misrepresentation
(Cypherpunk Mailing List, Nov. 12, 1992).
Perry Metzger vehemently disagreed, saying:
Pardon, but isn’t 2600 magazine a magazine by crackers for crackers? 2600 is named after
the frequency of the disconnect tone used in blue boxes, isn’t it? What I’m afraid of is that
I will get confused with one of them – I’m not sure they are necessarily being
misrepresented (Cypherpunk Mailing List, Nov. 13, 1992).
Pseudonymous user Anthrax responded by saying:
2600 is named after the frequency of the disconnect signal used in the telephone system
for several decades. But so what? Doesn’t PGP violate patent laws (or surely come pretty
close)? Why would you want to be associated with PGP and its use, then? (Cypherpunk
Mailing List, Nov. 13, 1992).
Despite Metzger’s framing of the raid as a legitimate response to the illegal activities of crackers,
Draper expressed trepidation over the use of encryption. “According to the disclaimer [for the Mac
version of PGP], PGP is ‘contraband’? Will I get the ‘thought police’ busting down my door for
using it?” (Cypherpunk Mailing List, Nov. 12, 1992).
Responding to concerns over the behavior of the hackers, George Gleason suggested the
development of an ethical code for the cypherpunks, calling on the group to develop a set of
cultural values grounded in civil disobedience rather than “wrecking”. Perhaps if the cypherpunks
got ahead of the law and established a progressive set of principles, they might be able to influence
the likelihood of law enforcement restricting their activities. While sympathetic to the general
objectives Gleason proposed, Tim May was largely skeptical that they would succeed, likening them to
a “Young Men’s Crypto Association” designed around preaching “hacker morality”. As he put it:
This YMCA will perhaps teach some set of values to perhaps 90% or even 99% of the
hacker community it preaches to. But what of the rest? A case can be made that such
preaching will _energize_ this minority into action, if only to poke a stick into the eye of
society (Cypherpunk Mailing List, Nov. 20, 1992)
May further argued that the formation of such a code could be used as a curb on free expression –
noting that “many of the ‘crypto anarchy’ views I've been espousing for several years now have
been seen by some as grossly immoral and dangerous. Should the YMCA (the Young Men's
Crypto Association, remember) argue _against_ such ideas?” (Cypherpunk Mailing List, Nov. 20,
1992). May turned the discussion back on its head, arguing that the techniques of crypto and
privacy would themselves serve as a more effective form of protection against other hackers than
would “moral lectures against thievery or fraud”.
Gleason pushed back on May in this debate over whether a cultural or a political
solution to these problems would ultimately be effective, arguing that some of the problems the
cypherpunks aimed to address were in fact fundamentally social. Gleason’s critique cut to the heart
of May’s crypto-anarchist beliefs:
Re crypto-anarchy, at heart I’m in favor; in practice I’m in favor; and I see no contradiction
in promoting an internalized sense of ethics in these areas. The underlying deep problem
is we live in a society which is addicted to the emotions that go along with violent acts.
Ultimately that problem needs to be addressed, and it’s beyond the scope of this group to
do that, except as each of us can do things in contexst [sic] of our civic lives (Cypherpunk
Mailing List, Nov. 21, 1992).
The underlying divide between May and Gleason in this debate is not necessarily over the values
of crypto-anarchy itself, but rather May’s technocratic approach to realizing them. Gleason directly
called out the claims May made about technical solutions, saying that it was erroneous to think
that the technical problems would be easier to solve than the social ones.
Gleason was not alone among cypherpunks in taking issue with the emphasis placed on
anarchy on the listserv. Others attempted to reframe the politics of the group around an alternative
political philosophy, some going as far as to reject anarchic politics in principle as well as in
practice. Phil Karn responded to Gleason’s remarks by saying:
I find myself getting a little uncomfortable with some of the more anarchistic ideas
expounded in this and similar groups. My interest in cryptography is very simple. I’m not
interested in overthrowing the government by force…I am very interested, however, in
cryptography’s enormous potential to protect individual privacy,
concluding his argument with the statement, “Just as good fences make good neighbors, we may
well find that in the hands of the people, good cryptography will make for good government. That’s
why I find cryptography so interesting, and that’s how we should sell it to the public” (Cypherpunk
Mailing List, Nov. 27, 1992). Karn is arguing here that cryptography can make possible privacy
protections that do not rely on the force of law. He expresses concern that legal protections require
a “good will” on the part of government that had been lacking – encryption technologies, by
contrast, would allow citizens to protect the rights afforded to them under the Constitution.
Tom Jennings, the founder of the BBS network FidoNet, expressed a similar opinion, saying that the
“other stuff” dragged the group off topic. Jennings described the efforts of the cypherpunks as
“infowar”, urging the community to call their work “privacy” rather than “encryption” for its
positive connotations. “PRIVACY is age-old, and crosses all boundaries, and is inherently anti-
statist. ENCRYPTION is cloak’n’dagger, late night movies, WWII, espionage, etc” (Cypherpunk
Mailing List, Nov. 1, 1992).
These discussions suggest a vibrant community of cypherpunks who had alternative ideas
about what encryption was and what it could do, and who were constantly engaging in a process
of negotiation over its cultural meaning, even when they made claims to the contrary. Some sought
to reframe the work of the community, expressing an interest in moving away from
the radical crypto-anarchist philosophy of May in favor of a more moderate privacy movement
grounded in practices of civil disobedience. Some also pushed back on the dominant technocratic
vision for what encryption is and what it can do. This socially grounded reframing of encryption
was double-sided; on the one hand, encryption would have the power to enhance democratic social
practices, particularly by bolstering individuals’ right to privacy. On the other hand, it reflected
concerns over the potential for “social engineering” of encryption technologies, exhibiting fears
that imperfect use of technologies could lead to incursions by authorities.
Soon after, May posted a message in response to these threads, titled PARANOIA, THIS
LIST, AND APPROPRIATE TOPICS. May began his lengthy note by saying:
Recently, several people have expressed concern that the proper message is not being
conveyed, even that a subversive and seditious message is often presented on this list. Some
have cited discussion of military cipher systems as being too dangerous for us to discuss,
while others have expressed some discomfort at the "anarchist" flavor of some postings.
Well, folks, we all have different axes to grind. Since this is an open, unmoderated list,
little control of content can be expected (Cypherpunk Mailing List, Nov. 28, 1992).
In his email, he depicted the listserv itself as an example of crypto-anarchy in action, pushing back
against advocacy by members who wanted to encourage the formation of a set of publicly
presentable values by saying that “little control of content can be expected”, though “those who
want to sell ‘privacy’ are free to”. Despite these laissez-faire claims, he strongly asserted a value
for leaving the discussion open in his closing statement:
Anyone uncomfortable with the free discussion of ideas and technologies on this list,
anyone who fears personal or professional liability for views expressed by _others_ on this
list, should probably unsubscribe from the list. I certainly don’t intend to stop writing about
the areas that interest me (Cypherpunk Mailing List, Nov. 28, 1992).
The message was a moment of closure for the discussion – it made clear that though not
all cypherpunks would necessarily espouse the philosophy of crypto-anarchy, it would guide the
norms for discussion within the group. Rather than engaging in a tightly coordinated effort to
advocate for privacy, the listserv would serve as a place where cypherpunks would discuss radical
ideas and share classified information; though encryption could in theory be associated with
privacy, the group would not abide by any communally designated code of ethics. Importantly,
this position rested not on a sense of consensus among the group but rather on a determinist
understanding that these things would happen because there was no technical means to prevent
them. The group would aspire to embody crypto-anarchy in action.
Conclusion
To conclude, I consider the broader significance of these findings for
the cryptographic projects that are the focus of the ethnographic work I describe in chapter five.
The ideas of crypto-anarchy and the legacy of the cypherpunks loomed large throughout my field
work, but I found relatively little evidence of crypto-anarchy in practice among the network of
cryptographic communities I observed. This seems to suggest that the practical work of the
cypherpunk community – the technologies they built as well as crypto-anarchy as a form of
community practice – failed to keep pace with the ideas they elaborated through their
cryptographic imaginary. The product of a successful effort by the cypherpunks to promote a new
set of values for cryptography, the imaginary they constructed through crypto-anarchy developed
a social cachet and cultural significance that far outlasted cypherpunk code.
Though crypto-anarchy as it exists today is generally understood in the terms set out by
Timothy May, studying early discussions on the cypherpunk listserv and analyzing core
cypherpunk texts revealed a wider range of interpretations within the community and a fair degree
of resistance to the adoption of crypto-anarchy wholesale by all community members. Though the
Cyphernomicon outlines an ostensible openness to alternative viewpoints and welcoming of debate
on the listserv, from a practical standpoint, crypto-anarchy came to define both the code
production of the group – what projects the community centers on building and how they are
designed – as well as its value systems and forms of self-governance.
A good illustration can be seen in how the cypherpunks dealt with spam. Spam was the subject of
a long-running controversy: the administrator of the cypherpunk listserv, John Gilmore, resisted
moderating spam or harassing speech when posted by other users, refusing requests to implement
filters on the principled stance that this would constitute a form of censorship. He eventually
backtracked, and in 1996 began moderating messages given the high volume of irrelevant posts –
but he then received so many attacks from list members for doing so that he shut down the group
entirely and called on the cypherpunks to find a new home (Frauenfelder, 1997).
Though this was a decision with fairly limited impact – it affected only the 1,400 or so
subscribers to the listserv at that point – it is notable for another reason: the treatment of principles
as outweighing long-term objectives. In many respects, members of the cypherpunk community
focused on principled cases – for example, considering worst-case scenarios for security threats or
providing as many options to users as possible to grant them maximal freedom – over creating
usable software. This meant that the cryptographic technologies actually produced by the
cypherpunk community were, by and large, challenging to use, which prevented widespread adoption
by the nascent community of Web users – thus hindering the long-term goal of protecting online
privacy. This was exacerbated by the uncertain legal status of cryptographic tools like PGP, which
further inhibited adoption.
In the early stage of my fieldwork, I had a conversation with a technologist who considered
himself a former cypherpunk. He’d joined the listserv as a teenager, and avidly followed
discussions on the list, taking part in heated debates himself and going on to work as a technologist
for an advocacy organization that aligned itself with cypherpunk ideas. We discussed the
significance of the cypherpunk community, and how they had experienced a resurgence in
popularity after the Snowden revelations, which proved many of their fears of widespread
government surveillance right. I observed a hint of melancholy in his chronicling of the group’s
history, and asked him what he thought about the long-term impact of their work. The cypherpunks
were too optimistic, he lamented – they had too much faith in the potential of technologies to
change the world.
Though he didn’t say it explicitly, this made me wonder whether the cypherpunks’ faith in
technology was also tied to a lack of faith in average users. Privileging expertise, and making
principled but hard-line decisions such as Gilmore’s stand on spam filtering and May’s position
on abusive listserv discussions, the cypherpunks developed a narrow construction of who the ‘user’
of cryptographic tools would be. In the next chapter, I explore the implications of this construction,
and the work of contemporary cryptographic projects devoted to broadening the cryptographic
community.
Chapter 5: Transforming the Cryptographic Community
In the 21st century, the social imaginary around cryptography is in the process of transformation.
Like all imaginaries, it builds on ideas that came before it; borrowing from crypto-anarchy the
objectives of privacy protection and carving out spaces for autonomy from the state. But unlike
crypto-anarchy, this new formulation sees the pursuit of autonomy as a collective enterprise,
grounded in the values of social justice and the support of marginalized communities rather than
the defense of individualized civil liberties. This chapter traces a new cryptographic
imaginary in formation, examining both the ideas and material practices of members of a
community dedicated to the preservation of human rights in the digital age.
Methodological Framework
I make a shift in this chapter from a largely historical and interpretive mode of analysis to
incorporating ethnographic methods, combining archival research with participant observation
data gathered from field work conducted between 2015 and 2018 at sites including the Chaos
Communication Congress, Internet Freedom Festival, RightsCon and Crypto Summit 2.0. This
perhaps merits a lengthier introduction than the other chapters, as I see this shift in methods as
relevant to the analytical contributions of this chapter.
I argue that the crypto community in the present day is engaging in the process of
reformulating the cryptographic imaginary around a new set of principles. This new imaginary
draws, consciously and unconsciously, on a tradition in feminist theory that challenges the strict
drawing of distinctions between public and private space. In keeping with the concept of
intersectionality, it acknowledges that surveillance is not experienced equally by all members of
society; thus encryption can play an important role for marginalized communities that are
disproportionately exposed to the gaze of surveillance by corporations and the state. It can enable
the formation of what Catherine Squires has called enclave publics – publics which are hidden
from the view of the dominant public (Squires, 2002). The work of this feminist cryptographic
imaginary, thus, reformulates many of the principles of crypto-anarchy – the valuing of privacy
and pursuit of sovereignty – as a collective, rather than individualized enterprise. This has occurred
in tandem with changes in the crypto community itself, changes that foreground the usability of
cryptographic software, the queering of community spaces and the cultivation of an ethic of care.
The ‘crypto community’ as it exists in the present day is not a stable or consistent entity: it
is made up of an amorphous collective of activists, technologists, and organizations that generally
share a commitment to the protection of privacy, freedom of expression and freedom of association
in the digital age. This community long pre-dates the Snowden revelations, but experienced a
substantial boost in attention and resources following them that make Snowden an important
inflection point for their work. There is no consistent self-identification for the group – some
members may identify as activists and advocates, but many do not. Several organizations describe
themselves as civil society or civil liberties groups, but others resist such classifications, which
carry particular weight in the internet governance community. Many express their work through
the framework of digital rights, or internet freedom, while such terms are deemed US-centric and
problematic by others. A select few still call themselves cypherpunks, but more would resist this
labeling.
Identity is particularly salient to many of the participants in this amorphous collective. In
keeping with this sentiment, I decided to resist seeking to bound or classify them as anything more
descriptive than the vague term crypto community.[34] Instead, drawing on the tradition of multi-
sited ethnography outlined by George Marcus (1995), I ‘followed the thing’, tracing cryptography
as it circulates through different contexts at a series of networked field sites. These sites were often
conferences where the crypto community would converge to discuss their work and collectively
participate in building privacy projects. But the conferences were not just physical spaces; they
were permeated throughout by online discourses: Twitter debates, IRC and Slack channels, Signal
group threads and e-mail listservs that preceded, followed, and were even at times
contemporaneous with in-person discussions. Indeed, the ‘field’ as I conceptualize it here
challenges the drawing of clear distinctions between ‘online’ and ‘offline’ spaces; conversations
at conferences would build upon and make reference to online discussions, and debates cut short
by the end of a conference panel would move into online spaces in addition to being carried on in
the lobby. I often found happenings during conferences to be fragmentary and contextless without
the time and effort it took to integrate myself into the online community.
I built relationships with informants that over time provided me access to some of these
online spaces, some of which were public and others which required invitation. My credentials
from working for Global Voices and my fellowship at the Electronic Frontier Foundation helped me to build
trust as a part of the community and as someone who similarly valued the protection of privacy
and freedom of expression. My ongoing presence at many different conferences also contributed
to this process, though I suspect for some it may have made my role as a researcher a bit harder to
parse – I could sense at times that people were unsure how to place me: whether I was sympathetic to
the perspectives of the ‘suits’ at RightsCon, ‘hackers’ at the CCC, or ‘activists’ at the IFF.

[34] The term ‘crypto’ itself developed an alternative meaning over the two years of my field work, further problematizing my attempt at classification; it is now used as a shorthand for ‘cryptocurrency’ among digital currency enthusiasts. Though this use of the term received pushback among the communities I engaged with in my fieldwork (see Franceschi-Bicchierai, 2017), I see this fork in the meaning of ‘crypto’ as not insignificant: though they are not the focus of this work, the projects of Bitcoin and Ethereum themselves build on cypherpunk ideas, and the communities that surround them are more likely to embrace this legacy. Finn Brunton’s forthcoming book Digital Cash: A Cultural History discusses this in greater detail.

I felt
this most urgently when engaged on subjects such as allegations of abuse in the community, where
neutrality and objectivity were neither an ideal nor an option, even given my marginal
insider/outsider position as a researcher.
I was helped in navigating these complexities by the presence of two other researchers who
were also active in their field research in these spaces, though their projects focused on different
subjects. All American women in our late twenties and early thirties, we consulted with one
another about our field work, particularly making time to discuss how to navigate thorny ethical
questions as they were raised over the course of our field work. I believe our proximity to one
another also helped to normalize the presence of researchers in the community who were both
sensitive and committed to community engagement, reducing the kind of antipathy that Hintz and
Milan (2010) described in their field work with similar tech activism groups. In a broad sense, my
experiences in the field were not dissimilar to those of other researchers who have conducted
ethnographic field work in communities focused on technology and activism; I found familiarity
with their accounts to be helpful at many junctures in navigating the intricacies of
engagement with activist communities (Dunbar-Hester, 2014; Milan, 2013; Portwood-Stacer,
2013; Wolfson, 2014; Juris, 2008).
I initially conceptualized this work as a multi-sited ethnography (Marcus, 1995), but found
that this at times stoked confusion because of my incorporation of both historical/interpretive and
ethnographic modes of inquiry into my research practice. Over time I found myself describing my
work using a new framework developed by Jessa Lingel, “networked field studies”, which she
describes as “an approach that allows for looking across multiple communities and field sites to
build a coherent set of analytical claims about the role of technology and everyday life” (Lingel,
2017, 1). Though her article was published toward the end of my experiences in the field,[35] I find
her description particularly relevant given its emphasis on comparative work as a means to making
sense of networked activist communities that lack stable, consistent participants. In addition, as
Lingel writes, “Although indebted to and inspired by the ethnographic method, particularly in the
context of the digital, networked field studies is better thought of as a qualitative method that draws
on ethnographic techniques in order to develop theory across different community contexts” (4).
Networked field studies operates at a meso-level, researching human activity by focusing
on the study of communities as a way of understanding the complicated and often contradictory
ways members relate to one another. I found this to be a fitting approach for understanding the
crypto community as a heterogeneous space with distinctions that can be drawn on lines of
geography, class, gender, and technical expertise (among others). I was able to observe differences,
for example, between the CCC as a primarily European hacker community that is bolstered by the
European Union’s strong privacy and data protection regulations, RightsCon, a conference with a
comparatively strong presence from policy representatives from technology companies and
lawyers from advocacy organizations, and the Internet Freedom Festival, which tends to attract
funders, lower- to mid-level employees of technology projects and civil society groups and
explicitly aims to reach out to and support activists from the Global South. Each of these sites thus
cultivates distinct concerns, repertoires of contention, and analytical frames. Many of my
informants are individuals who move across all of these spaces, but may foreground different
aspects of their identity at each of them, performing different versions of their selves in each space
while maintaining a relatively consistent ideological position (Goffman, 1956).

[35] At least for the purposes of this dissertation.
There are some distinctions in our approaches, however: though Lingel is most invested in
networked field studies as a method for analyzing socio-technical practices – how activists think
about and incorporate technologies into their work – I was most interested in understanding how
these communities built technologies for activism: who did they think they were building
cryptographic tools for? How did they conceptualize the user of cryptographic technologies? What
did they imagine these technologies could do to change the world? I sought to answer these
questions through participant observation: taking part in sessions where members of the
community presented projects they had been working on, conducting 18 interviews with members
of the community about their work, and engaging in numerous informal group conversations with
community members in between conference sessions. In addition, as previously mentioned, I
participated in conversations with group members on Twitter, over e-mail listservs, and Signal
chats which further informed my understanding of the community. I used thematic coding to
identify issues as they appeared in these field notes, drawing on them both to compile a narrative
timeline describing the community’s development and to categorize issues of interest.
The chapter proceeds as follows: I first situate my field work in its historical context by
drawing on a combination of archival research and the accounts provided by community members
to describe the fragmentation of the cypherpunk community and nascence of the internet freedom
movement. Then, I summarize the findings of my field work through discussion of several key
themes that emerged. Finally, I conclude with an assessment of what these findings suggest about
the reformulation of the cryptographic imaginary.
The Fragmentation of the Cypherpunk Community
Most accounts of the cypherpunks’ activities – both in the literature[36] and the accounts of
community members – end roughly around the early 2000s, with the conclusion of the Bernstein
case and the events that followed September 11th. But what happened to the cypherpunks after the
dissolution of the listserv and the end of the Crypto Wars? Though this is a subject deserving of deeper
archival study, what follows is a (fragmentary and incomplete) account pieced together from the
stories I gathered in my fieldwork as well as from popular press accounts.[37]
A primary reason for the fragmentation of the cypherpunk community was the dissolution
of the listserv: as mentioned in the previous chapter, the question of moderating the listserv grew
into a heated debate among its members. Some cypherpunks grew increasingly frustrated with the
volume of spam messages on the listserv and wanted to use filters, while others, including the
listserv administrator, John Gilmore, viewed spam filtering as a form of censorship. When Gilmore
finally decided in 1996 to begin moderating the listserv, he received so many abusive messages
that he suspended it on his toad.com server (Frauenfelder, 1997). In response,
members of the community migrated to Usenet, setting up a new group called alt.cypherpunks.
Other cypherpunks who objected to Usenet set up their own distributed listserv that would be
“censorship resistant”, thus enabling subscribers to re-subscribe if they were kicked off by a
moderator. A third cypherpunk, Lance Cottrell, created a gateway to unify the separate discussions.

[36] See, for example, Levy (2001), Greenberg (2012) and Rid (2016).

[37] I hope to provide a more complete and definitive account in future work by interviewing the original cypherpunks. I determined that it would be most useful for the purposes of this study to first focus on archival research drawing on the listserv discussions. This decision was reinforced in my field work by emerging divides between the younger crypto community members I engaged in my participation and older members of the cypherpunk community, particularly with regard to these older members’ resistance to the implementation of codes of conduct and to confronting abuse – indeed, John Draper (Captain Crunch), one of the early active members of the listserv, was accused of repeated instances of abuse at hacker conferences, one of which involved a community member frequently present during my field work. I believe that the relationships I built within the present-day crypto community would have been challenged by conducting these interviews during the period of my engagement in the field. With time and distance, I think they will be feasible.
Unsurprisingly, this scattershot approach led the discussions among cypherpunks to slowly
disintegrate. By 2001, Gilmore wrote to the listserv that he would shut down the list once and for
all, saying “The cypherpunks list degenerated a long time ago to the point where I have no idea
why more than 500 people are still receiving it every day” (Rodger, 2001). Another possible reason
for the disintegration of the group was, ironically, the close of the crypto wars. Though the
Bernstein ruling was somewhat inconclusive in establishing a definitive right to use cryptography
in the public sector, the cypherpunks deemed it a victory – the war was won, and absent this reason
for banding together, they dispersed.
This proved to be inopportune timing: following the attacks of September 11th, 2001, the
US government substantially consolidated its surveillance powers and the National Security
Agency began making large investments in its ability to collect and analyze phone and internet
data.[38]
The NSA had a reputation for being relatively isolated in the US government intelligence
apparatus, siloed off both geographically and culturally behind a “triple fence” (Aid, 2008). It
became clear in the weeks and months following September 11th that this communicative isolation
contributed significantly to America’s inability to prevent the attack – the catastrophe
resulted not from a lack of information but from a failure to communicate effectively in ways that would
enable analysts to piece that information together (Clarke, 2004; Harris, 2011).
The realization of this failure led directly to organizational changes throughout the
intelligence apparatus at multiple levels. At the national level, the NSA increased the openness of
its communication with the rest of the military bureaucracy, and developed programs like PRISM
that facilitated information sharing with 23 other agencies throughout the government. It
strengthened global partnerships with allies including the UK, Canada, Australia and New Zealand,
who were also given access to the information. It also built partnerships with technology
companies.

[38] See Aid (2009), Bamford (2008), Schneier (2014), and Harris (2015) for further detail on the expansion of bulk surveillance capabilities following 9/11.
By the early 2000s, the NSA had lost its monopoly on signals intelligence; large technology
companies had similar capabilities for surveillance at scale by monitoring and analyzing internet
traffic. The NSA established data-sharing agreements with large technology companies, while
simultaneously developing tactics to tap into the fiber optic cables connecting corporate data
centers. These shifts in both organizational structure and culture contributed to the development
of a global networked surveillance state – the exact evolution that the cypherpunks most feared.
Law enforcement officials were also quick to draw a connection between encrypted
messaging and terrorism, seeking to strengthen regulation of cryptography. Phil Zimmermann
himself told the New York Times he would be surprised if the hijackers were not using PGP to
protect their e-mail discussions: “I just assumed somebody planning something so diabolical
would want to hide their activities using encryption” he said (Schwartz, 2001). A Washington Post
reporter published an interview with Zimmermann where he said he received hate e-mails accusing
him of having blood on his hands (Cha, 2001). But Zimmermann quickly responded with a strong
reassertion: “Did I re-examine my principles in the wake of this tragedy? Of course I did. But the
outcome of this re-examination was the same as it was during the years of public debate, that strong
cryptography does more good for a democratic society than harm, even if it can be used by
terrorists” (Zimmermann, 2001). Viewpoints like Zimmermann’s were difficult to express publicly
in the weeks and months after the attacks, however, as the United States turned its attention to the
interests of national security and the War on Terror. As has happened so many times over the
history of cryptography, with the onset of military conflict encryption was once again enfolded
into the national security interests of the state.
But, in contrast to prior instances such as the two World Wars and the Cold War, this has not
been accompanied by a monopolization of cryptography by intelligence agencies. Encryption
remains in wide use by individuals and corporations, and if anything has grown substantially in
the last few years. This led to a heightening of tensions over the proper use of cryptography in
what law enforcement officials have called the “going dark” debate, evoking agents’ diminishing
access to citizens’ communications because of the use of encryption. These tensions eventually
boiled over into an outright policy conflict, reviving the policy battles of the 1990s in what many
are now calling the Crypto Wars 2.0. In time, law enforcement demands were met with resistance
by advocacy organizations seeking to prevent the passage of laws that would introduce back doors
into encryption technologies.[39]
The Cypherpunks Lose the War, and Start a New One
But in the mid-2000s, in an environment where national security was paramount, the
cypherpunks’ ideas were no longer in vogue: ‘cypherpunk’ served more as an identity marker
claimed by those who maintained a devotion to the ideas of crypto-anarchy than to denote a
community working to build cryptographic code.

[39] In contrast to the first Crypto Wars, these policy battles are now global: governments in the US, UK, France, the Netherlands, and Australia, among others, have sought – and in some cases successfully implemented – mandates that backdoors be introduced into encryption technologies. See: Kang, C. (2016, May 8). Police and Tech Giants Wrangle Over Encryption on Capitol Hill. New York Times; Price, R. (2015, July 1). David Cameron is going to try and ban encryption in Britain. Business Insider; Moody, G. (2015, Jul. 3). New Dutch law would allow bulk surveillance, compelled decryption. Ars Technica UK; Mathews, D. (2015, May 20). Defence laws threaten to shut down Aussie encryption. ITNews.

Those who held on to cypherpunk ideals were in
a depressed state, as reflected by this lament by Frank Rieger, a German hacker, before the 2005
Chaos Computer Congress:
We lost the war for privacy, we lost the war for free internet maybe, we lost the war against
the surveillance industry, at least for now. So, we won a few battles, we prevented the
Clipper Chip quite a while ago, we managed to keep Germany at least an island of privacy
in Europe, but that's about it. And we think it's about time that we talk about this, that we
think about why we have lost, what can be done, what is the situation now and what follows
from that, based on that we have some kind of foresight on where the travel goes, where
we'll end up and what the technology is that will be used to keep an eye on us…Why did
we lose? …we had no plan B for September 11th. The very moment those planes crashed
in to the World Trade Center we had already lost, there was no way around it (Gonggrijp
& Rieger, 2005)
Joined on stage by Dutch hacker Rop Gonggrijp, Rieger emphasized the need to focus on how to
“maintain breathing space in society” – to help those who had devoted their lives to making change,
rather than solely building technology. This message was somewhat ahistorical: as the last few
chapters have illustrated, the crypto community has always been centrally concerned with activist
work and the protection of the right to encrypt through legal battles, despite rhetorical appeals that
center technical solutions.
Rieger and Gonggrijp were not alone in making this call: scholars and activists alike joined
them in the observation that nation-states were learning ways to use the internet for control. In
2002, researchers at the University of Toronto’s Citizen Lab, Harvard Law School’s Berkman
Center for Internet and Society, Oxford University’s Oxford Internet Institute and Advanced
Network Research Group began a cross-disciplinary effort to study and document internet filtering
and surveillance practices being deployed by countries around the world, concluding that the
number of countries using internet filters grew from three in 2002 to 25 in 2007
(Mahmud, 2007). Groups of technology activists began to band together to develop technical
solutions that would aid in the effort. Two projects in particular built on cypherpunk ideas – as one
cryptographer relayed it to me in an account of the history of the cypherpunks, in the mid-2000s
“some of the people who participated in the original cypherpunk movement actually started to
succeed technically, and we got things like cryptographic primitives, Tor and the Signal protocol
which actually work” (Interview, August 29, 2017).
Tor
Tor is an anonymity network, accessed through a browser, that masks the IP addresses of its users
by routing their internet traffic through several encrypted ‘hops’ across a network of relay servers.
Like the anonymous remailers created by the
cypherpunks of the 1990s, Tor builds on David Chaum’s ideas about mix networks through a
concept called ‘onion routing’ developed by several US Naval Research Laboratory (NRL)
employees. Built by a group of MIT computer scientists, Tor was developed at an uneasy nexus
of US military research and technology activism.[40]
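To make the underlying mechanism concrete, the sketch below illustrates the layered-encryption idea behind onion routing. It is a deliberately toy model – the XOR ‘cipher’, relay names, keys, and framing are all invented for illustration, and none of this reflects Tor’s actual protocol – but it shows the essential property: each relay can peel exactly one layer, learning only the next hop, while only the exit sees the payload.

import hashlib

def xor_layer(key: bytes, data: bytes) -> bytes:
    # Toy symmetric 'cipher': XOR with a SHA-256-derived keystream.
    # Encryption and decryption are the same operation. NOT secure.
    stream, counter = bytearray(), 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

def build_onion(message: bytes, route: list) -> bytes:
    # Wrap the message in one layer per relay, innermost layer first.
    # Each layer names the hop the remaining onion should be sent to.
    onion, next_hop = message, b"exit"
    for name, key in reversed(route):
        onion = xor_layer(key, next_hop + b"|" + onion)
        next_hop = name.encode()
    return onion

route = [("relay1", b"key1"), ("relay2", b"key2"), ("relay3", b"key3")]
keys = dict(route)
onion = build_onion(b"meet at noon", route)

hop = "relay1"  # the sender dispatches the onion to the first relay
while hop != "exit":
    peeled = xor_layer(keys[hop], onion)   # each relay uses only its own key
    header, _, onion = peeled.partition(b"|")
    hop = header.decode()                  # ...and learns only the next hop
print(onion)  # b'meet at noon'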
The project received funding from the NRL and US State Department, which was increasingly
interested in supporting civil society-oriented internet projects under an emerging rubric of
internet freedom.

[40] How closely Tor is associated with the US government has been a subject of contestation over the years. For years, the Tor Project’s website framed the project’s purpose as “protecting government communications”, and two of the three creators of the browser had ties to NRL during the time they developed the source code. This has formed the basis of claims made by critics that Tor should not be trusted. Tor is now an open-source project developed by a diverse group of developers, an important quality for building trust in the security of its code. The project has also since made efforts to diversify its funding. For an in-depth history and analysis of the Tor project, see Maréchal (forthcoming).

As the Bush Administration
established the cultivation of democratic institutions around the world as a strategic priority,
members of the State Department focused on bolstering civil society organizations and the freedom
of the press in parts of the world that were under authoritarian control. Promoting an open internet
and the free flow of information was articulated as a key part of this mission: as Shawn Powers
and Michael Jablonski describe it, the internet became “not only the object of struggle but…critical
infrastructure for pursuing larger geopolitical goals” (Powers & Jablonski, 2015, 10).
This interest further expanded under the Obama Administration, gaining visibility during
Hillary Clinton’s tenure as Secretary of State under a new policy doctrine known as ‘21st Century
Statecraft’, linking American interests to the leveraging of the ‘networks, technologies, and
demographics of our interconnected world’ (Clinton, n.d.). At a speech at the Newseum, Clinton
announced the formation of a Global Internet Freedom Task Force, saying:
On their own, new technologies do not take sides in the struggle for freedom and
progress, but the United States does. We stand for a single internet where all of humanity
has equal access to knowledge and ideas. And we recognize that the world’s information
infrastructure will become what we and others make of it (Clinton, 2010).
Cypherpunk ideas played an important role in laying the groundwork for a conceptualization
grounded in a negative conception of freedom – freedom from government involvement (Hellegren,
2017). The State Department thus focused on a doctrine of the ‘free flow of information’ –
normalizing information as a commodity in the interest of expanding markets for American firms’
content, software and hardware (Powers & Jablonski, 2015). Funding digital rights projects such
as Tor is directly linked to this mission.
Wikileaks
Though in some respects post-cypherpunk projects aligned with US government internet
freedom policy goals, on other fronts, former cypherpunks engaged in overt battle with the US
government. In 2006 Julian Assange, perhaps the most visible self-identified cypherpunk of the
21st century, established Wikileaks, a non-profit organization devoted to the publication of leaked
information.
Wikileaks could be seen as the realization of a vision put forward by Timothy May over
the cypherpunk listserv over a decade earlier. In a post titled “Introduction to BlackNet”, May
asserted that “BlackNet is in the business of buying, selling, trading and otherwise dealing with
*information* in all its many forms”:
We buy and sell information using public key cryptosystems with essentially perfect
security for our customers. Unless you tell us who you are (please don’t!) or inadvertently
reveal information which provides clues, we have no way of identifying you, nor you us
(Cypherpunk listserv, August 17, 1993).
The concept of the BlackNet was particularly amenable to a trade in state secrets, encouraging
whistleblowers in government to adopt anonymity to render government more transparent through
strategic leaks.
Wikileaks adopted a similar stance, asserting:
Wikileaks will be the outlet for every government official, every bureaucrat, every
corporate worker, who becomes privy to embarrassing information which the institution
wants to hide but the public needs to know. What conscience cannot contain, and
institutional secrecy unjustly conceals, Wikileaks can broadcast to the world.
Wikileaks will be a forum for the ethical defection of unaccountable and abusive power to
the people. (Wikileaks, 2007).
At its founding, the Wikileaks website asserted the project would focus on leaks relating to
authoritarian governments in Asia, the former Soviet bloc, Sub-Saharan Africa and the Middle
East – but over time, it shifted focus toward the US government, and specifically US military
operations, likely spurred on by the leaks of documents by Chelsea Manning about the War on
Terror.
These dual trajectories for cypherpunk ideas are explained by contradictions within US
government policy in the 2000s. These projects were responses, each in their own way, to evolving
tensions within US government agencies, some of which see digital rights technologies as critical
for the promotion of human rights and democracy, while others see them as harmful to national
security interests (Carr, 2016; McCarthy, 2010; Maréchal, forthcoming). Tor and Wikileaks both
have complex relationships with US government interests, evolving in ways that challenge state
sovereignty while at times receiving benefits such as funding, leaked information or vocal support
from political allies.
These projects are also examples of new forms of organization that grew to supplant the
cypherpunk community: where in the 1990s, cypherpunks largely created projects on their own or
in ephemeral groups and partnerships, in the 2000s these new projects adopted a more persistent
organizational form, helped along by former cypherpunks. Spurred on by growing threats to
privacy and free expression online, the emergence of these more formalized projects enabled new
people who shared the values of crypto-anarchy – and some who did not – to join and contribute
to these projects through advocacy and code production.
Networks of crypto advocates
By the late 2000s, a community made up of formal and informal cryptography- and
privacy-focused organizations began to coalesce, supported by a combination of government-
linked and private funding. Geographically dispersed, members of these organizations began to
gather annually at conferences to discuss and share their work. Much of my field work took place
at these gatherings, which were located in North America and Europe. My interviews took place
at the Internet Freedom Festival, a gathering held annually beginning in the late 2000s,[41] and the
Chaos Communication Congress, an annual conference organized by the German hacker group
Chaos Computer Club, held in Germany beginning in 1984. I also draw on participant
observation data from RightsCon and the Crypto Summit, a series of policy-focused
summits centered on human rights and technology, organized annually by the advocacy group
Access Now since 2011.
Each of these conferences cultivates a unique space – I found the CCC and IFF to be
especially generative for this sort of research, because they created spaces of varying formality for
participants to gather and have conversations outside of the conference setting. For example, the
CCC allows groups to apply for assembly spaces distributed throughout the venue, where they can
set up tables to advertise their work, hack on projects together and commiserate about common
interests. These spaces featured often elaborate designs representing the aesthetic of their
organizers, and were populated much of the day and late into the night by formal and informal
community members. Given the long hours many attendees will spend in the space, different
organizations within the CCC also host tea rooms with floor pillows that participants can relax and
sleep on, and one floor of the conference center was dedicated to participants with young children,
including games designed to teach them principles of hacking and creative play with technology.

[41] The initial festivals were smaller and shifted geographic locations from year to year; in their original instantiation they were known as Circumvention Tech Summits.
The IFF is somewhat less engineered, but the conference space features two outdoor atriums where
participants gather over coffee and (often) vegan food provided by local food truck vendors. Both
conferences also leave space for self-organized sessions that participants can sign up for on a
conference wiki page before and throughout the conference. RightsCon was more formal by
comparison; it was hosted in a more conventional conference space with a single atrium for
participants to gather in between sessions and a café where they could purchase food. It did not
offer opportunities for participants to create their own sessions on the sidelines.
Another notable distinction between these communities is how they approached naming
participants. The question of naming is a particularly politically sensitive one within the crypto
community, and became all the more so during the time I was present. Historically, some members
of the community have resisted using their legal name for various reasons – it may not accurately
describe their gender identity, or may put them at risk of exposure given political sensitivities
around the work of digital activism. But, over the course of the years I conducted my field work,
it became clear that sensitivities around anonymity could be used to cultivate a sense of paranoia and
fear about the law that served, at times, to center certain members of the community as the most
‘radical’ and willing to transgress legal boundaries. For example, one organization required all its
members to adopt pseudonyms in their work, ostensibly to protect their identities. However, this
practice meant that when members of the organization were abused by the group’s director and
eventually left, survivors had trouble connecting to one another, while other members still in the
organization were unaware of what happened and lacked the resources to uncover the abuse.
Resisting the practice of naming is thus not always an act of sensitivity; it can be a mechanism for
exerting control over others.
The conference spaces took varying approaches to identifying participants by name.
RightsCon was the most traditional, giving each participant a name tag with their legal name. The
CCC had no name tags, instead using wristbands to identify participants with their tickets. The
wristbands were designed so they could not be removed and shared with another participant. In
sessions, the speaker would be identified with a name, but frequently this was a pseudonym rather
than a legal name. The IFF used name tags, but asked participants to tell them what name they
would like to use to identify themselves at the conference. Some participants used pseudonyms,
while others used legal names. Wearing a name tag was compulsory at the conference in order to
ensure that unregistered participants did not circulate on the premises; this was meant to protect
participants from a number of potential threats: state actors aiming to infiltrate
activist organizations, abusers, and individuals who had previously been banned for Code of Conduct
violations.
Gender proved to be particularly salient to the question of naming at IFF, particularly in
the last year of my fieldwork. That year, the name tags featured a line where participants were
asked to write out their preferred pronoun. In a session early on in the conference, a trans
participant urged cis participants to all write their preferred pronouns so that it wasn’t just ‘people
like them’ that had to do so. I later attended a session on queering the internet freedom space where
other trans and genderqueer participants expressed discomfort with the space, because it
foregrounded their gender as the most important aspect of their identity.[42] After the session, I did
note that many cis female participants had written in their preferred pronoun, but cis male
participants had not.

[42] This seemed to break down into a divide between old-timers, who had been to the IFF before and were encouraging of the practice (and often tended to be from Western countries), and newcomers, many of whom were from the Global South, who were attending the Festival for the first time and disliked it.
I describe these aspects of each conference in detail because they depict the distinct
character of each of these communities, and how things like the layout of a space, organization of
a conference session, or how participants are represented can themselves reflect the intentions of
organizers and community members. These details are indicative of community values, and at
times became the focus of broader conflicts about inclusion and respect.
The Snowden Effect
My fieldwork began shortly after Edward Snowden’s revelations about NSA surveillance,
which were a catalyzing moment for the crypto community. The revelations validated the work of
members of the community who had long warned about the likelihood of government surveillance,
and drove many new members to join the fold. Many members of the community see the
publication of the revelations as an inflection point, after which the crypto community saw an
influx of people, energy, and funding to work on projects. As one technologist described it to me,
“People who had twenty minutes ago been considered irrationally paranoid over the course of one
publication became insufficiently prepared. The same perspectives, practices, approaches and
beliefs were all of a sudden validated” (Interview, March 7, 2017). Another told me “Post-
Snowden there’s been a huge wave of interest from the developer community…suddenly a lot of
people want to contribute” (Interview, March 8, 2017).
The revelations also diversified the community of people working on cryptographic
projects. The developers I spoke to who had joined the community after the closure of the
cypherpunk list, but before Snowden, often had similar backgrounds that bridged activism and
working on Free and Open Source Software: for example, one technologist told me he’d been
engaged in anti-war protesting after September 11th and developed concerns about privacy as a
result of those experiences, but only began working on cryptography after starting to work on
Debian, an open source software project. As he described it:
To participate in Debian you had to have a GPG key and that made me start to think about
networked communications. At that point computers were not isolated systems. The idea
that I can control my software is meaningful on its own, but your data is traveling off your
device, so how is it traveling off your device? So crypto came from that, that’s where my
interest in the crypto came from. (Interview, March 10, 2017).
Another worked as a sysadmin and began to feel “creeped out” by how much she saw data being
linked. “It was a moment when I installed a paranoid plugin in my life”, she told me. Up until then
“My activism was very much about boring things” (Interview, March 8, 2017). After Snowden,
members of other communities not traditionally associated with privacy or encryption began to
join: the Indymedia community was one common entry point, as were other forms of media
activism and social justice organizing.
The revelations also encouraged a sense of political consciousness among academic
cryptographers. In a 2015 article titled “The Moral Character of Cryptographic Work”, the well-
regarded cryptographer Philip Rogaway admonished members of the field for abandoning the
importance of applied cryptography. “If cryptography’s most basic aim is to enable secure
communications, how could it not be a colossal failure of our field when ordinary people lack even
a modicum of communication privacy when interacting electronically?”, he wrote (Rogaway, 2015,
1). Though cryptographers were not themselves responsible for, as he put it, “turn[ing] the Internet
into an instrument of total surveillance”, they did have the capacity to help (10). He called for
academic cryptographers to be more in dialogue with the work of the cypherpunks, who had long
been engaged in implementing systems:
Cypherpunk cryptography has been described as crypto with an attitude. But it is much
more than that, for, more than anything else, what the cypherpunks wanted was crypto with
values. And values, deeply felt and deeply embedded into our work, is what the
cryptographic community needs most. And perhaps a dose of that cypherpunk verve
(Rogaway, 2015, 46-7).
The growth of the community led to shifting priorities for cryptographic work. At the 2015
CCC, Rieger and Gonggrijp reprised their talk, “Ten Years After We Lost the War”. They asserted
that it was “a quite naïve belief” to think that if only people knew the secrets of government
corruption, they would work together to change the system. Instead, they said:
A sense of belonging to a community is going to become very important… If we want to
form further alliances we have to stop the trend that people are choosing minor bickering
fights, fighting people in their own ranks who aren’t using the words that they like, aren’t
finding different stuff important. Having the same goals is what makes us a culture, having
the same ideas about what a society should be. We should actively try to stop fighting each
other. We are not each others’ enemy. (Gonggrijp and Rieger, 2015).
Their talk challenged the dominant focus on technology within the community at the CCC, a
challenge that not everyone in the audience accepted. Moreover, building a community culture around
like-minded goals would require more than just an end to in-fighting among its members: to be
successful, the community would need to attend to how difference is experienced among its
members by focusing on the building of coalitions across divides of nationality, ideology, and
levels of technical expertise. I will return to this dynamic later in the chapter.
Usability, Herd Immunity and an Ethic of Care
Initially, however, the shift in priorities was reflected in a refocusing among community
members on the usability of cryptographic software. Usability has been a persistent issue
throughout this history of encryption technologies: in chapter three, I recounted how RSA
determined that most of its users did not necessarily need to know how encryption worked or
whether it was being used, while Phil Zimmermann’s PGP was notoriously challenging, attracting
users in the free and open source community who enjoyed tinkering with complex technical systems.
The Snowden revelations raised the stakes by demonstrating the need for everyone to use these
tools, not only those at greatest risk. Unexpectedly, Facebook, owner of the messaging service
WhatsApp, was one of the first to respond by turning on end-to-end encryption using the Signal
protocol by default for a billion WhatsApp users. WhatsApp’s implementation was designed for
users with lower technical sophistication – which meant it was less secure than the open-source
alternative, Signal (Evans, 2017).
This raised questions for community members about the trade-offs associated with
improving the usability of encryption software. The central challenge they grappled with was how
to effectively communicate security principles to users who may not have technical expertise: is it
better to simplify a system and risk users believing they are safer than they are? Or to introduce
more menu options that enable them to tailor the system to their own needs, communicating points
of failure through a technical complexity that risks alienating the less technically savvy?
I saw an uneasiness about the answers to these questions reflected in many of the technical
sessions I attended. As one technologist described it to me,
We’ve been focused on ensuring that we don’t give anyone a false sense of security. When
we tell someone it’s protected we want to make sure it’s protected against all conceivable
attackers. The result of this is you require a lot more of the user. When the user doesn’t
have to make every decision you have to have someone make the decisions for you. You
have to know the answers to questions, have to know the state of the world, nature of
protocols, etc. [You] need to expose users to this so they know the range of adversaries.
(Interview, March 10, 2017).
He asserted that this approach had essentially resulted in two basic options: either plaintext e-mail,
used by the majority of people, or super-secure email for nerds: “I think the encrypted mail
community has basically failed for 30-35 years. We’ve basically failed to create usably secure
encrypted mail.”
The technologist was active in working on a program that would use opportunistic
encryption for e-mail messages – implementing encryption that would provide basic protections
to more users, an approach that would, as he put it, not touch the ceiling but raise the floor.
Attending a meeting where he presented the work in progress, I witnessed hesitancy among some
participants, who expressed concerns about creating technologies that overpromised security to
users. As he told me later, he thought these concerns were:
…coming from a legitimate place. We really do care that these tools provide people with
the security those people expect. At the same time, the framing of it says it’s either 100%
secure against all attackers or you might as well not bother…we don’t want to force people
to default to plaintext (Interview, March 10, 2017).
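The logic he described can be rendered as a short sketch. The Python example below is my own
illustration, not his project’s code: the key directory and deliver() function are hypothetical
stand-ins for a keyserver lookup and a mail transport, with the PyNaCl library supplying the
public-key encryption. The point is the fallback policy he articulated: encrypt whenever a
recipient key is available, and otherwise deliver in the clear rather than refuse to send at all.

# Illustrative sketch of opportunistic encryption (assumes PyNaCl is installed).
from nacl.public import PrivateKey, SealedBox

# Hypothetical directory of recipients who have published a public key.
alice_key = PrivateKey.generate()
KEY_DIRECTORY = {"alice@example.org": alice_key.public_key}

def deliver(recipient, payload):
    # Stand-in for a real mail transport.
    print(f"-> {recipient}: {payload[:16]!r}...")

def send(recipient, body):
    """Encrypt when the recipient has a known key; otherwise fall back to plaintext."""
    pub = KEY_DIRECTORY.get(recipient)
    if pub is not None:
        deliver(recipient, SealedBox(pub).encrypt(body))  # anyone can encrypt to pub
        return "encrypted"
    # Rather than refuse to send -- which pushes users back to plaintext tools
    # entirely -- deliver unprotected and record the downgrade.
    deliver(recipient, body)
    return "plaintext"

print(send("alice@example.org", b"meet at noon"))  # encrypted
print(send("bob@example.org", b"meet at noon"))    # plaintext fallback

Raising the floor in this way protects the many users who would otherwise send everything
unencrypted, without claiming to protect against every conceivable attacker.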
Another person I interviewed, a self-described usability expert, faced a different set of
challenges. She worked as a consultant on projects designed for the most dangerous cases: users
who worked in human rights in parts of the world where they faced particular danger for their
work. On the first day of a new job, her boss warned her that reaching out to users directly could
endanger them: “You can’t talk to any of these users, you’ll kill them” (Interview, March 6,
2017). One solution she proffered was to put technologists more in dialogue with people
in the advocacy space, noting that many of the policy-oriented sessions at conferences were
attended only by advocacy workers, and technical sessions were only attended by technologists.
The opportunistic approach taken by the developers who worked on WhatsApp was
ultimately seen as positive by most of those I spoke with. As the usability expert put it, “My
personal take is that harm reduction is a useful tactic: the sooner we can provide herd immunity,
that’s a good thing”, referencing a frequently used analogy that widespread use of encryption can
act as a vaccine, protecting those who most need it by ensuring that their encrypted web traffic
does not stand out as unusual within a population.
These conversations about usability at times resulted in broader critiques of the prevalence
of libertarian thought within technical communities. As one technologist put it,
I think we’ve…done a terrible job within the cypherpunk community saying why should
you care about infosec or cyber. There’s this sort of techno-libertarian viewpoint that you
should protect yourself against this to protect your data, a rational liberal view of you
should care about yourself. But you should care because you have a responsibility to the
other people whose data I hold. (Interview, March 10, 2017).
Several participants sought to reframe the impetus to build encryption technologies around an ethic
of care. In contrast to the crypto-anarchic view that placed the onus for privacy protection on
individuals, they saw privacy as a networked phenomenon (Marwick & boyd, 2014). For them,
using encryption was not only a means of protecting oneself, but looking out for the privacy of
others in their social network. This was felt particularly deeply at the Internet Freedom Festival,
at which community members would often find ways to memorialize the absence of activists who
had been jailed for their work, through posters, protests, and speeches in their honor. These
memorials served as a reminder of the high stakes involved in digital rights work for those in the
community who lived in repressive contexts, driving home the importance of the work at hand.
Communities and Codes of Conduct
In her forthcoming book Hacking Community, Christina Dunbar-Hester argues that
diversity initiatives within technical communities aim to recast the ‘social infrastructure’ that
determines how they form and proceed by building infrastructures of care: recreating the norms
and values that form the basis for association within their communities by “hacking” community
practices (Dunbar-Hester, forthcoming). In the context of the crypto community, doing so meant
addressing the effects of meritocracy cultivated by the historical dominance of libertarian thought
in the community. As one policy advocate described it to me,
The libertarian view, the cypherpunk ‘tech is going to fix all of our privacy issues,’ as a
community that was not going to fly anymore…we realized that because this community
was totally hostile to women, anyone who wasn’t a white cisgender male, and that we had
to be proactive. This civil libertarian framework was insufficient for creating a space that
was generally going to help people (Interview, March 9, 2017).
The critique embedded in this quote addresses the unhealthy impact meritocracy had on the
community: it enabled some members to get by on their technical merits while placing themselves
in positions of power and control that were used to mask and justify abusive conduct – long a
challenge faced within technical communities (Dunbar-Hester, forthcoming; Oldenziel, 1999;
Marvin, 1988).
Re-centering the users was an important element of the project to resist meritocracy, but
was only part of a broader mission to ‘queer the internet freedom community’, as the organizers
of one session described it. Creating more inclusive spaces took more than simply willpower. As
one technologist put it to me,
I don’t know if you’ve noticed, but there’s a really piss poor representation of people of
color in this space. There needs to be more access, because I would not be here without
amazing mentorship. I have never felt like an affirmative access case because I’m good at
my shit, but [name redacted] thought it was important to take people like me under his
wing. Slowly but surely our numbers are increasing, not just black [and] white diversity,
also a lovely queerness to this space that is very beautiful. (Interview, March 10, 2017).
The technologist described the importance of mentorship and resources in cultivating a more
inclusive community but, notably, validated her own presence through claims to technical expertise. The
problem of resources and persistence of technical meritocracy was made even more visible in a
project where several community members sought to map out the internet freedom community,
using post-its and markers to create a representation that would make visible the power
discrepancies and point to areas where they could be addressed. I worked alongside the DC-based
CEO of a funding organization and two women from advocacy organizations on the African
continent. Looking back at our map, we realized that organizations from the Global South were
visibly on the margins, excluded from funding, technical resources and networks of people – some
didn’t even fit onto the white space of the page.
The resource gap was one of many challenges to inclusion: another was the cultivation of
safe spaces for people who identify as women or gender non-conforming, and for those representing
non-Western countries, different abilities, and varying levels of technical expertise. The organizers of the
Internet Freedom Festival particularly took this on as their mission, drawing on frameworks from
social justice and community organizing and incorporating them into the physical space.
As Sandy Ordoñez, the festival’s co-founder and strategic advisor, told me,
We’re not 100% masters at it, but we try and use visual stuff – the fact that there’s two
courtyards and you’re in the sun. Or the fact that there’s color – I use glitter a lot, also
because of the psychological effect, because you can’t take yourself too seriously when
you have it (Interview, May 31, 2017).
The Festival’s glitter table was a staple ingredient in the culture cultivated around the event: by
the last year of my fieldwork, the organizers even found sponsorship from an eco-friendly glitter
company to supply one kilogram of biodegradable glitter to the festival.43 The use of color and
sparkles was explicitly designed to counteract the all-black aesthetic frequently adopted at many
hacker events, which can be intimidating, Ordoñez told me:
I think that certain tech spaces can feel very intimidating for a whole range of women. They
can be hyper masculine and aggro, and they have none of the visual markers that would
make you feel welcome. I grew up in an incredibly rough neighborhood – to the point that
I was held up several times at my house at gun point – and even despite that, when I first
went to my first hacker space I felt intimidated. I always imagined what it must be like if
you’re a woman coming from a more traditional culture, and you’re coming into those kind
of spaces. (Interview, May 31, 2017).
Listening to her comments, I recalled my own experience as a newcomer to the CCC, which, despite
making efforts at inclusion such as the provision of gender-neutral bathrooms, nevertheless felt
challenging to navigate.

43 In the last year of my fieldwork, the Festival instituted a consent policy for the use of glitter: in the
year prior, it was not unusual for individuals to throw clouds of glitter at one another, a particular issue at
the closing night party. The policy became an object lesson in the value of consent as a concept with
broader applications than just in relation to sexual behavior.
The kinds of concerns that Ordoñez emphasized were not solely voiced by female
participants – they were shared by men who visibly expressed a commitment to allyship. When I
asked him what he saw as the challenges he faced in his work, a male technologist who worked on
a longstanding post-cypherpunk project told me “One of the serious problems that I see right now
is that we are 6 of 7 white cis males from global north and the last individual is a Japanese cis male.
There’s no diversity. It’s of course a serious problem” (Interview, March 8, 2017). He and several
others I interviewed were at the festival to be more directly in dialogue with users from non-
Western contexts.
Codes of Conduct
Though everyone I spoke to expressed a commitment to inclusion, they often disagreed as
to the right way to go about building a more inclusive space. A particular center of controversy
was the implementation and enforcement of codes of conduct, a mechanism aimed at creating
structures and modes of accountability to prevent harassment and abuse.44 While some community
members viewed codes of conduct as necessary and desirable, others resisted them as exclusionary.
As one EU-based funder told me:
Things like codes of conduct are important for the scene but tend toward a US frame, like
diversity rather than an enforced policy, it’s something that we just live. Don’t use it as a
weapon to exclude people that you feel uncomfortable having around. I don’t think that’s
a healthy future – you don’t accept deep level diversity of people. It’s better to stay in
communication; not talking to people isn’t a way forward. (Interview, March 8, 2017).
The funder saw the use of a Code of Conduct primarily as a mechanism for removing people from
the community and reducing diversity of thought within it.
This resistance was particularly visible at the 2017 CCC. The conference organizers
expressed a commitment to welcoming and including women and members of underrepresented groups,
instituting a mentorship program to help new participants get to know the community. But they
resisted taking actions that they believed would infringe on the autonomy of the sub-groups that
made up the CCC. This crystallized when organizers allowed one group to declare its assembly a
“Code of Conduct free zone”, expressly rejecting any form of behavioral constraint intended to
prevent abuse: “We have proclaimed the CoC-free zone because we believe that we can reasonably
relate to each other even without behavioral codes or moral guardians”, the group announced on
Twitter.

44 I focus fairly narrowly on one example of a controversy over codes of conduct within the crypto
community; for a more extensive treatment of this issue encompassing other tech communities see
Dunbar-Hester (forthcoming).
This was seen as a particularly incendiary move in a year that followed allegations of
violent sexual abuse, some instances of which took place at previous iterations of the CCC – and
the rejection by conference organizers of all submitted talks on the subject of harassment was seen
as a purposeful stance by many community members who planned on attending. This was further
problematized when a community member reported they were notified on Christmas Eve, days
before the start of the Congress, that a person who had assaulted them would be allowed to attend.
The community member reported the assault in August, but despite providing documentation
including a hospital report and photographs, the organizers said they did not know what really
happened. Though the organizers did not explain their reasoning, the community member
interpreted the response as stemming from the police’s decision not to prosecute the case.
For some community members this was a step too far, and several decided to
boycott the event. Weeks before, two community members had formed a new organization, Protect
Our Spaces, calling others to join together in refusal to attend any event with known abusers in
attendance. In an open letter, the founders of the group wrote:
It is no longer enough to speak out against this sort of violence against women and
communities at risk: we must isolate and control for factors that allow this to happen. We
will create spaces that exemplify the values and rights we fight for in our work.
We find fault not only with the perpetrators but also with the organizers who knowingly
invite these individuals to their events. When known abusers are allowed to attend and take
the stage at events, the message it sends to survivors is clear: Their work is more important
than our safety.
We refuse to attend any such event that sustains a culture of fear and fails to prioritize the
safety of women and high-risk communities. (Protect Our Spaces, 2017).
Navigating harassment and abuse is particularly complicated given the resistance to
hierarchy and, in some cases, espousal of anarchy within the community. Though many
community members express a commitment to the value of inclusion, they express this
commitment in a variety of ways. Over the course of the conflict over the CCC’s code of conduct,
several group members made links to Jo Freeman’s concept of a “tyranny of structurelessness” –
that the absence of organizational structure can create room for abuses of power that are all the
more pernicious because they are invisible. In tracing connections among non-hierarchical
spaces, some of these critiques advanced the claim that the ideas of crypto-anarchy created
conditions that facilitated the abuse. In contrast to the ‘crypto-anarchy in practice’ model adopted
to resolve conflicting views of the community in the 1990s, here community members opted to
create new, safe spaces where they could build new norms of engagement that prevent abuse from
happening.
Though my research on these issues is ongoing, my findings thus far demonstrate that these
community members are engaged in a process of agonistic negotiation over the ideals, governance
norms, and organizational structures that are unique to this space, experimenting not only with
codes of conduct, but restorative justice frameworks, boycotts, and the formation of support
networks or unions to develop a new way of being as a community.
Conclusion
What do these changes to the crypto community suggest about the evolving cultural value
of cryptography? Though this is only one of several co-existing meanings, I see the community
work I have outlined here as engaged in the formation of a new feminist cryptographic imaginary.
This new imaginary engages with two understandings of the term intersectionality: first, by
evoking the work of Kimberlé Crenshaw it acknowledges that surveillance is not distributed
equally within society; people who are positioned on the margins may be disproportionately
impacted by the rise of surveillance capitalism. It also acknowledges the shift toward a secondary
understanding of intersectionality as a form of practice that seeks to resist these forms of
oppression by embracing difference within the community.
In contrast to crypto-anarchy, this feminist imaginary sees the work of the crypto
community as encompassing an ethic of care, seeking to create safe and autonomous spaces in which
individuals who are marginalized because of their ideas or identity can join together. Encryption
is an important factor in making these spaces possible, carving out zones hidden from the
persistent gaze of state and corporate surveillance under the conditions of surveillance
capitalism.
Though the community members rarely conceptualized their work in abstract theoretical
terms, the imaginary draws on antecedents in feminist thought that trouble the distinctions between
notions of private and public. Historically, public and private spaces are also gendered spaces –
the public is associated with masculinity, the market and assembly, and social personality, while
the private is associated with femininity, the home and intimacy (Warner, 2002). Second wave
feminists sought to “explode the private” by bringing issues that had been relegated to the realm
of private conversations into public space: making the personal political (MacKinnon, 1988, 100).
Feminist historian Joan Wallach Scott has similarly argued that the politics of gender “dissolves
the distinction between public and private” (1989, 26).
For the cypherpunks, privacy was largely regendered as male: it became associated with
the pursuit of autonomy through the use of encryption, exiling the embodied self in order to enable
the digital self to roam free through anonymous transactions. But this formulation of privacy is
being challenged by some in the contemporary crypto community. They are working collectively
to reformulate the imaginary around encryption in ways that challenge simple public/private
binaries to foster in-between safe spaces for collective self-expression.
Encryption can be wielded in the collective effort to challenge the dominance of technology
companies, data brokers, and state intelligence agencies. It can be used to create new forms of
collectivities much like those described by Catherine Squires as enclave publics, “signified by the
utilization of spaces and discourses that are hidden from the view of the dominant public and the
state” (Squires, 2002, 458). Squires evokes Nancy Fraser’s critique that the distinctions between
public and private are often over-simplified; rather than a single, unified public sphere there are
subaltern counterpublics, where people can create separate spaces in which to discuss their
interests without the interference or oppression of a dominant group.
In societies structured by inequalities, members of dominant groups have advantages
because they are able to set the rules for public speech (Squires, 2002). This is as much the case
within technological communities as it is in the broader social system, and may be further exacerbated
by our present technological conditions: the dominant groups are not just social groups as
traditionally understood, but include the technological apparatuses that mediate our social lives.
Both systems of technological surveillance and the complexities of encryption software can
contribute to reinforcing the positions of those in power. Thus, tackling inequality is not only a
social, political, and economic problem, but a technological problem – and, for the crypto
community, encryption is an important instrument in solving it.
Conclusion
Between 1967 and 2017, cryptography transformed from an instrument of control, enabling
states to protect and expand their sovereignty, to an instrument of autonomy, a necessary
precondition for the right to free expression and association in the digital age. Over the course of
this dissertation I have illustrated how this evolution took place in tandem with the
development of networked infrastructure. Encryption became a nexus for negotiations by
regulators, firms, and individuals over the implications of networked technologies for society; how
connectivity would redefine the relationship between information and power in ways that had
profound consequences for privacy and security as we experience them in our everyday lives.
Responding to competing political, economic and social imperatives, cryptography alternately
served to protect the security of data, to enable new kinds of anonymous communications and
transactions, and to protect individuals’ privacy against the growth of mass surveillance.
Now that I have traced out this history I can pull together several disparate threads into a
more coherent narrative. The initial question that brought me to this project was motivated by a
sense of curiosity about the origins of the surveillance state: how, despite the technical and legal
capability to incorporate privacy protections into our internet infrastructure, did we wind up with
so many technologies that surveil us?
Though to some extent this inquiry has raised as many questions as it answers, it did
illustrate a few important factors. It showed that cryptographic software had indeed advanced in
ways that could have enabled greater privacy protections, and that the conclusion of the Crypto
Wars of the 1990s created a policy opening to make this legally possible. But there was more at
play here than code and law: other influences worked to inhibit these uses of encryption.
One important factor was, in fact, ambiguity. The US government has never held a
consistent position with regard to the value of encryption for society. This ambivalence is not
neutral; creating uncertainties around encryption’s legal status has, in effect, inhibited adoption
over the past several decades. Though in the 1970s it seemed tenuously supportive, even
developing the (possibly compromised) Data Encryption Standard for private sector use, by the
1990s regulators were more consistently opposed. They tested out a number of different regulatory
approaches in the effort to curb the use of encryption, including export controls, voluntary and
threatened involuntary restrictions on cryptographic research, prosecution via customs
enforcement and the promotion of backdoored encryption technologies.
These varying approaches reflect an important dynamic of this history of encryption: that,
in fact, different agencies within the government hold distinct and at times contradictory positions
on its value. In the Bernstein case, the courts left an open door for regulators to take a more
supportive direction – and indeed, as Hellman outlined, there are valid reasons why regulators
would see widespread use of cryptography as necessary for the protection of US national security
interests. As American firms increasingly adopted the use of networked technologies for
communication, encryption came to play an important role in protecting their interests from espionage
and interference. In recent years, NSA officials have expressed increasing levels of support for
strong encryption, while FBI officials are vehemently opposed.
The technology industry also played a role in shaping the narrow adoption of encryption
technologies: the software industry’s promotion of RSA as its preferred standard helped to
encourage use of encryption despite an onerous export controls system, but perhaps at the cost of
competitors. Clearly RSA’s vigorous pursuit of patent protections meant few firms were willing
to use PGP while it remained in uncertain legal territory, pushing the privacy-centered software to
the margins. And PGP’s notorious difficulty further meant that it attracted a user base interested
in tinkering around with software systems, to the exclusion of less technically savvy users.
In combination, this resulted in the widespread use of encryption for the protection of
financial transactions, enabled by the growing e-commerce industry, but not in ways that would
have protected users’ privacy. The Snowden revelations put these slow-moving developments,
which took place over the course of decades, into stark relief, making visible the flawed security
of our internet infrastructure.
Cryptography and Autonomy in the Network Society
This project also dealt with deeper questions about the cultural meaning of cryptography
in a networked society; interrogating the values and meanings we associate with encryption
technology that are so frequently the subject of contestation. I sought to understand how we as
individuals and members of communities can regain some semblance of autonomy and control
under conditions of surveillance capitalism that concentrate the ability to collect, analyze, and use
the information we produce on a daily basis in the hands of the state and corporations. Over the
course of five chapters, I have described encryption as process, artifact and infrastructure,
destabilizing the notion that it has any singular meaning. I have traced through the multiple
discourses associated with cryptography, eventually introducing the notion of a cryptographic
imaginary to describe how these forms of meaning spill over to exceed the bounds of discourse,
becoming embodied in technical and organizational practices.
One cryptographic imaginary, crypto-anarchy, emerged over the course of the 1990s, as
amateur cryptographers developed notions that the cultural value of cryptography exceeded its
instrumental use for the protection of privacy and security. For the cypherpunks, encryption
became part of a cyberlibertarian project to achieve spaces of autonomy from institutions of the
state and society at large.
But these visions of crypto-anarchy were troubled, over time, by evolving ideas about what
autonomy truly means. In 1996, John Perry Barlow asserted in his “Declaration of the
Independence of Cyberspace”, “We are creating a world where anyone, anywhere may express his
or her beliefs, no matter how singular, without fear of being coerced into silence or conformity”.
In the Declaration, Barlow expresses an aspiration that cyberspace would make it:
very difficult for the physical world to impose sovereignty on cyberspace. Even though the
relationship of cyberspace to the physical world is exactly that of mind and body, it is
difficult for the body to locate the mind and impose anything on it. (Hershkovits, 2015).
The cypherpunk ideas that swirled around the listserv could remain imaginary only so long
as, in Barlow’s words, “Our identities have no bodies, so, unlike you, we cannot obtain order by physical
coercion.” Though they persist in some circles, cypherpunk visions where the body hid in privacy
exile while the mind roamed free on the net are hard to reconcile in a technological environment
where our on and offline selves are inextricably fused.
These ideas are also belied by the growing number of people who have been imprisoned in
recent years for using encryption technologies: over the course of my work on this dissertation,
the Zone9 bloggers, a collective of bloggers and journalists in Ethiopia, were charged with
terrorism and sent to jail for nearly two years. Though many of the bloggers were finally freed, in
the spring of 2018 Zone9 blogger Befeqadu Hailu was once again detained by the Ethiopian
government and remained in prison at the time of writing. In 2017, the Istanbul 10, a group of
human rights advocates, were also jailed for attending a digital security training in Turkey. Most
of them have since been freed, with the exception of Taner Kilic, Chair of Amnesty International
Turkey. The rise in states’ use of censorship and surveillance and the criminalization of encryption
tools make it increasingly clear that the use of cryptography to mask the contents of a message
does not, by any means, mean the erasure of the bodies using it.
The stakes for cryptographic work have thus been raised considerably: if the cypherpunks
once promoted the notion that technologies could exceed the force of law, the 21st century has been
accompanied by a vigorous reassertion of the monopoly of violence states hold over cryptographic
code and code-makers. Still, encryption is all the more necessary to the protection of human rights
under conditions of surveillance capitalism (Zuboff, 2015). As David Kaye, the United Nations
Special Rapporteur on Free Expression, has argued,
Encryption and anonymity, and the security concepts behind them, provide the privacy and
security necessary for the exercise of the right to freedom of opinion and expression in the
digital age. Such security may be essential for the exercise of other rights, including
economic rights, privacy, due process, freedom of peaceful assembly and association, and
the right to life and bodily integrity (Kaye, 2015).
Encryption creates the technological conditions under which a public sphere can thrive, despite
the ubiquity of surveillance.
The history I’ve described over the course of this project suggests that public key
cryptography is particularly well-suited to the kinds of networked communications made possible
by the web: it enables the sharing of encrypted information without the prior existence of a secret
channel, opening up the capacity for private communications networks. As such, it can serve as a
corrective for some of the harms networked communications infrastructures make possible –
namely, that the technologies that connect and empower us can also be used to surveil and hurt us.
Cryptography can create new spaces of possibility for communities to form in an environment of
mass surveillance; it can enable those with marginalized identities or marginalized views to create
spaces for expression and cultivate relationships with like-minded individuals.
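The property at stake here can be made concrete with a toy version of the key exchange Diffie
and Hellman (1976) introduced: two parties derive a common secret while transmitting only public
values. The Python sketch below is purely illustrative; the numbers are far too small to be
secure, and real deployments add authentication on top.

# Toy Diffie-Hellman exchange: a shared secret emerges from an entirely
# public conversation, with no prior secret channel.
p, g = 2087, 5            # public modulus (a small prime) and generator

a_secret = 123            # Alice's private value, never transmitted
b_secret = 456            # Bob's private value, never transmitted

A = pow(g, a_secret, p)   # Alice publishes g^a mod p
B = pow(g, b_secret, p)   # Bob publishes g^b mod p

# Each side combines the other's public value with its own private one;
# both arrive at g^(a*b) mod p.
shared_alice = pow(B, a_secret, p)
shared_bob = pow(A, b_secret, p)
assert shared_alice == shared_bob
print(shared_alice)       # common key material for an encrypted session

An eavesdropper sees only p, g, A, and B; recovering the shared value from them is the discrete
logarithm problem. It is this asymmetry that opens up private communication over networks that
are themselves entirely open.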
This is especially valuable for communities that already experience oppression in the
public arena, and the absence of women and/or people of color in this narrative is by no means
irrelevant to cryptography’s thus far limited achievement of this vision. Rather, I argue, the politics
of inclusion in contemporary crypto spaces are a site of struggle to achieve the deeper potentials
that cryptography makes possible.
Though over the course of its history cryptography has traditionally been monopolized by
states, encryption is not inherently hierarchical or authoritarian-leaning; it does not by its
nature need to serve the interests of state secrecy. But neither is it individualized or meritocratic,
as some of its proponents have claimed. It does not, of its own, grant individuals the capacity to
overcome state power. If there is anything that this project has shown, it is that autonomy is a
collective enterprise, and cryptography a valuable tool in the project to achieve it. Cryptography
is, fundamentally, about communication – building networks, forging bonds, and creating
opportunities for change.
References
Abbate, J. (1999). Inventing the Internet. Cambridge: MIT Press.
Abelson, H., Anderson, R., Bellovin, S. M., Benaloh, J., Blaze, M., Diffie, W., Neumann, P. G.
(2015). Keys under doormats: mandating insecurity by requiring government access to all
data and communications. Journal of Cybersecurity, 1(1), 69–79.
Adler, S.E. (2017, Jan. 1). Encryption for All: Why This American Tradition Must Be Upheld.
California Magazine. Retrieved Apr. 6, 2018 from https://alumni.berkeley.edu/california-
magazine/just-in/2017-01-31/encryption-all-why-american-tradition-must-be-upheld.
Aftergood, S. (2016). Invention Secrecy Increased in 2016. Federation of American Scientists.
Retrieved from https://fas.org/blogs/secrecy/2016/10/invention-secrecy-2016/.
Agre, P. E. (1997). Computation and Human Experience. Cambridge: Cambridge University
Press.
Aid, M. M. (2009). The Secret Sentry: The Untold History of the National Security Agency. New
York: Bloomsbury Press.
Amnesty International. (2016). Encryption: A Matter of Human Rights. Amnesty International.
Retrieved from https://www.amnestyusa.org/reports/encryption-a-matter-of-human-
rights/2/.
Ananny, M. (2015). From Noxious to Public? Tracing Ethical Dynamics of Social Media
Platform Conversions. Social Media + Society, 1(1): 1-3.
Ananny, M. (2016a). Toward an ethics of algorithms: Convening, observation, probability and
timeliness. Science, Technology, and Human Values, 41(1): 93-117.
Ananny, M. (2016b). How do communication technologies have politics? Learning to read
infrastructures [Powerpoint]. Retrieved from University of Southern California
Communication and Technology.
Angwin, J. (2014). Dragnet Nation: A Quest for Privacy, Security and Freedom in a World of
Relentless Surveillance. New York: Times Books.
Angwin, J., Larson, J., Mattu, S. and Kirchner, L. (2016, May 23). Machine Bias. ProPublica.
Retrieved from https://www.propublica.org/article/machine-bias-risk-assessments-in-
criminal-sentencing.
Armer, P. (1975). Computer Technology and Surveillance. Computers and People, 24(9): 8-11.
Assange, J. (2012). Cypherpunks: Freedom and the Future of the Internet. New York: OR
Books.
Bamford, J. (1982). The Puzzle Palace: A Report on America’s Most Secret Agency. Boston:
Houghton Mifflin Company.
Bamford, J. (2008). The Shadow Factory: The Ultra-Secret NSA From 9/11 to the
Eavesdropping On America. New York: Doubleday.
Barbrook, R. and Cameron, A. (1995, Sept. 1). The Californian Ideology. Mute, 1(3).
Barzilai-Nahon, K. (2008). Toward a Theory of Network Gatekeeping: A Framework for
Exploring Information Control. Journal of the American Society for Information Science
and Technology, 59 (9): 1493-1512.
Batiz-Lazo, B., Karlson, T. and Thodenius, B. (2014). The Origins of the Cashless Society: Cash
Dispensers, Direct to Account Payments and the Development of On-Line Real-Time
Networks, C. 1965-1985. Essays in Economic & Business History, (32): 100-138.
Baym, N. (2015). Personal Connections in the Digital Age, 2nd ed. New York: Polity Press.
Beniger, J. (1989). The Control Revolution. Cambridge: Harvard University Press.
Benkler, Y. (2006). The Wealth of Networks: How Social Production Transforms Markets and
Freedom. New Haven: Yale University Press.
Benkler, Y. (2016). Degrees of Freedom, Dimensions of Power. Daedalus, 145(1): 18-32.
Bennett, C.J. (2008). The Privacy Advocates: Resisting the Spread of Surveillance. Cambridge:
MIT Press.
Bernstein, D. (1997). Declaration of Daniel J. Bernstein. Retrieved from
https://www.eff.org/document/declaration-daniel-j-bernstein.
Bernstein, D. (n.d.). Summary of the case status. Retrieved from
https://cr.yp.to/export/status.html.
Bidgoli, H. (2006). Handbook of Information Security, Volume 1: Key Concepts, Infrastructure,
Standards, and Protocols. Hoboken: John Wiley & Sons.
Bidzos, J. (2004). Oral History Interview With James Bidzos. Charles Babbage Institute.
Retrieved from https://conservancy.umn.edu/handle/11299/107117.
Bishop, M. (1998). Early Computer Security Papers, Part I. Computer Security Laboratory,
University of California at Davis. Retrieved from
https://csrc.nist.gov/csrc/media/publications/conference-paper/1998/10/08/proceedings-
of-the-21st-nissc-1998/documents/early-cs-papers/early-cs-papers-1970-1985.pdf.
Blanchette, J. (2012). Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of
Electronic Documents. Cambridge: MIT Press.
Blaze, M. (1994). Protocol Failure in the Escrowed Encryption Standard. 2nd ACM Conference
on Computer and Communications Security.
Boczkowski, P., & Lievrouw, L. A. (2007). Bridging STS and Communication Studies. In E. J.
Hackett, O. Amsterdamska, M. E. Lynch & J. Wajcman (Eds.), The Handbook of Science
and Technology Studies. Cambridge, MA: The MIT Press.
Bode, R. (1978, March/April). The Greeks Had a Word For It. THINK Magazine.
Boutin, P. (2008, Sept. 1). PGP author forgives Joe Biden’s anti-privacy legislation. Gawker.
Retrieved from http://gawker.com/5043974/pgp-author-forgives-joe-bidens-anti-privacy-
legislation.
Bowker, G. C., Baker, K., Millerand, F., & Ribes, D. (2010). Toward Information Infrastructure
Studies: Ways of Knowing in a Networked Environment. In J. Hunsinger, L. Klastrup &
M. Allen (Eds.), International Handbook of Internet Research. Dordrecht: Springer
Netherlands.
Bowker, G. C., & Star, S. L. (1999). Sorting Things Out: Classification and its consequences.
Cambridge: MIT Press.
boyd, d. (2010). Social Network Sites as Networked Publics: Affordances, Dynamics, and
Implications. In Papacharissi, Z. (Ed.) Networked Self: Identity, community, and culture
on Social Network Sites. Abingdon: Routledge.
Branscomb, L.M. (1974, Apr. 4). Personal communication.
Browne, S. (2015). Dark Matters: On the surveillance of blackness. Durham: Duke University.
Brunton, F. (2018). Digital Cash: A cultural history. Princeton: Princeton University Press.
Brunton, F. and Nissenbaum, H. (2011). Vernacular resistance to data collection and analysis: A
political theory of obfuscation. First Monday, 16(5).
Calaway, J.C. (2003). Benjamin Franklin’s Female and Male Pseudonyms: Sex, Gender, Culture,
and Name Suppression from Boston to Philadelphia and Beyond. Honors Projects Paper
18, Illinois Wesleyan University.
Carey, J. (1985). Communication as Culture: Essays on media and society. Boston: Unwin
Hyman.
Carr, M. (2016). US Power and the Internet in International Relations: The irony of the
information age. London: Palgrave Macmillan.
Castells, M. (1996). The Rise of the Network Society. Cambridge, Mass.: Blackwell.
Cha, A.E. (2001). To Attacks’ Toll Add a Programmer’s Grief. The Washington Post.
Chaum, D. (1988). The Dining Cryptographers Problem: Unconditional Sender and Recipient
Untraceability. Journal of Cryptology, 1(1): 65-75.
Clarke, R. (2004). Against All Enemies: Inside America’s War on Terror. New York: Free Press.
Clinton, H.R. (n.d.). 21st Century Statecraft. U.S. State Department. Retrieved from
http://www.state.gov/secretary/rm/2010/11/152078.htm.
Clinton, H.R. (2010, Jan. 21). Remarks on Internet Freedom. US Department of State. Retrieved
from https://2009-2017.state.gov/secretary/20092013clinton/rm/2010/01/135519.htm.
Coleman, E.G. (2013). Coding Freedom: The ethics and aesthetics of hacking. Princeton:
Princeton University Press.
Coleman, G. and Golub, A. (2008). Hacker Practice: Moral genres and the cultural articulation of
liberalism. Anthropological Theory, 8(3): 255-277.
Coombs, A.W.M. (1983). The Making of Colossus. Annals of the History of Computing, 5(3):
253.
Corrigan-Gibbs, H. (2014, Nov. 7). Keeping Secrets. Stanford Magazine. Retrieved from
https://medium.com/@stanfordmag/keeping-secrets-84a7697bf89f.
Cowen, R.C. (1977, Dec.). New Cryptography to Protect Computer Data. Technology Review.
Couldry, N. and Hepp, A. (2016). The Mediated Construction of Reality. Cambridge: Polity
Press.
Cox, J. (2016, Jun. 24). The FBI Is Classifying Its Tor Browser Exploit Because ‘National
Security’. Motherboard. Retrieved from
https://motherboard.vice.com/en_us/article/gv5jwj/the-fbi-is-classifying-its-tor-browser-
exploit.
Dam, K.W. and Lin, H.S. (Eds). (1996). Cryptography’s Role in Securing the Information
Society. Washington, DC: National Academy Press.
Dame-Boyle, A. (2015). EFF at 25: Remembering the Case that Established Code as Speech.
Electronic Frontier Foundation. Retrieved from
https://www.eff.org/deeplinks/2015/04/remembering-case-established-code-speech.
Deibert, R. (2013). Black Code: Inside the Battle for Cyberspace. Toronto: McClelland &
Stewart.
DeNardis, L. (2009). Protocol Politics: The Globalization of Internet Governance. Cambridge:
MIT Press.
DeSeriis, M. (2015). Improper Names: Collective Pseudonyms from the Luddites to Anonymous.
Minneapolis: University of Minnesota Press.
Diffie, W. and Hellman, M.E. (1976). New Directions in Cryptography. IEEE Transactions on
Information Theory, 22(6): 644-654.
Diffie, W. and Landau, S. (2007). Privacy on the Line: The Politics of Wiretapping and
Encryption. Cambridge: MIT Press.
Dubois, P. (1996, Jan. 11). News Release (email).
Dunbar-Hester, C. (2014). Low Power to the People: Pirates, Protest, and Politics in FM Radio
Activism. Cambridge: MIT Press.
Dunbar-Hester, C. (forthcoming). Hacking Community. Princeton: Princeton University Press.
Dupont, Q and Cattapan, A. (2017). Engendering Alice and Bob. Backchannels. Retrieved from
http://www.4sonline.org/blog/post/engendering_alice_and_bob.
Dyson, E. (1994, Jan. 18). For Interesting People (email).
Eamon, W. (1985). From the Secrets of Nature to Public Knowledge: The Origins of the Concept
of Openness in Science. Minerva, 23 (3): 321-347.
Edwards, P. (1996). The Closed World: Computers and the politics of discourse in cold war
America. Cambridge: MIT Press.
Edwards, P. (2003). Infrastructure and modernity: Force, time, and social organization in the
history of sociotechnical systems. In T. J. Misa, P. Brey & A. Feenberg (Eds.), Modernity
and Technology. Cambridge, MA: MIT Press.
EFF. (1991, June 7). Senate Anti-Encryption Bill Withdrawn. EFFector. Retrieved from
https://www.eff.org/effector/1/7.
EFF. (1994, Oct. 8). EFF Statement on and Analysis of Digital Telephony Act. Electronic
Frontier Foundation. Retrieved from
https://groups.csail.mit.edu/mac/classes/6.805/articles/digital-telephony/eff-oct8.txt.
EFF. (1997). Bernstein v. U.S. Department of State. Electronic Frontier Foundation. Retrieved
from https://groups.csail.mit.edu/mac/classes/6.805/articles/crypto/eff-bernstein-
summary.html.
Ellison, K. (2008). Cryptogrammatophoria: The Romance and Novelty of Losing Readers in
Code. Eighteenth Century Fiction, 20(3): 281-305.
Ellison, K. (2017). A Cultural History of Early Modern English Cryptography Manuals.
Abingdon, Oxon: Routledge, Taylor & Francis Group.
Elmer-DeWitt, P.J. (1993, Dec. 6). First nation in cyberspace. TIME.
Espiner, T. (2010, Oct. 26). GCHQ pioneers on birth of public key crypto. ZDNet. Retrieved
from http://www.zdnet.com/article/gchq-pioneers-on-birth-of-public-key-crypto/.
Eubanks, V. (2017). Automating Inequality: How High-Tech Tools Profile, Police, and Punish
the Poor. New York: St. Martin’s Press.
Evans, J. (2017, Jan. 22). WhatsApp, Signal, and dangerously ignorant journalism. TechCrunch.
Retrieved from https://techcrunch.com/2017/01/22/whatsapp-signal-and-dangerously-
ignorant-journalism/.
Eve, M. (2016). Password. New York: Bloomsbury.
Fagone, J. (2017). The Woman Who Smashed Codes: A True Story of Love, Spies, and the
Unlikely Heroine Who Outwitted America’s Enemies. New York: Dey Street Books.
Feistel, H. (1970, Mar. 18). Cryptographic Coding for Data-Bank Privacy: RC 2827. IBM.
Feistel, H. (1973). Cryptography and Computer Privacy. Scientific American, (228): 15-23.
Felt, A.P., Barnes, R., King, A., Palmer, C., Bentzel, C., and Tabriz, P. (2017). Measuring
HTTPS Adoption on the Web. 26th USENIX Security Symposium. Retrieved from
https://www.usenix.org/conference/usenixsecurity17/technical-sessions/presentation/felt.
Flanagin, A. J., Flanagin, C., & Flanagin, J. (2010). Technical code and the social construction of
the internet. New Media & Society. 12(2), 179-196.
Flynn, L. (1997, Aug. 27). Ruling Frees Professor to Teach Cryptography. New York Times.
Retrieved from https://partners.nytimes.com/library/cyber/week/082797encrypt.html.
Franceschi-Bicchierai, L. (2015, Sept. 2). Even the Inventor of PGP Doesn’t Use PGP.
Motherboard. Retrieved from https://motherboard.vice.com/en_us/article/vvbw9a/even-
the-inventor-of-pgp-doesnt-use-pgp.
Franceschi-Bicchierai, L. (2017, Nov. 28). Cryptocurrencies Aren’t ‘Crypto’. Motherboard.
Retrieved from https://motherboard.vice.com/en_us/article/43nk9b/cryptocurrency-are-
not-crypto-bitcoin.
Frauenfelder, M. (1997, Feb. 17). Homeless Cypherpunks Turn to Usenet. WIRED. Retrieved
from https://www.wired.com/1997/02/homeless-cypherpunks-turn-to-usenet/.
Freeman, J. (1972). The Tyranny of Structurelessness. The Second Wave, 2(1). Retrieved from
http://www.jofreeman.com/joreen/tyranny.htm.
Froomkin, A.M. (1995). The Metaphor is the Key: Cryptography, the Clipper Chip, and the
Constitution. University of Pennsylvania Law Review, 709. Retrieved from
http://osaka.law.miami.edu/~froomkin/articles/clipper1.htm#ToC27.
Galloway, A. (2004). Protocol: How control exists after decentralization. Cambridge: MIT
Press.
Gardner, M. (1977). A New Kind of Cipher That Would Take Millions of Years to Break.
Scientific American, 237(2): 120-124.
Garfinkel, S. (1995). PGP: Pretty Good Privacy. Beijing: O’Reilly & Associates, Inc.
Garfinkel, S. (1995b). The continuing investigation of Phil Zimmermann. WIRED. Retrieved
from https://www.wired.com/1995/03/the-continuing-investigation-of-phil-zimmermann/.
Gillespie, T. (2006). Engineering a Principle: ‘End-to-End’ in the Design of the Internet. Social
Studies of Science. 36(3): 427-457.
Gimon, C.A. (1995, Jun.). The Phil Zimmerman Case. InfoNation.
Goffman, E. (1959). The Presentation of Self in Everyday Life. New York: Anchor Books.
Goldschlag, D, Reed, M. and Syverson, P. (1999). Onion Routing for Anonymous and Private
Internet Connections. Communications of the ACM, 42(2): 39-41.
Gonggrijp, R. and Rieger, F. (2005). We lost the war: Welcome to the world of tomorrow.
Media.ccc.de. Retrieved from https://media.ccc.de/v/22C3-920-en-we_lost_the_war.
Gonggrijp, R. and Rieger, F. (2015). Ten years after, ‘We Lost The War’. Media.ccc.de.
Retrieved from https://media.ccc.de/v/32c3-7501-ten_years_after_we_lost_the_war.
Greene, T. (2011). From cloud and mobile security to encryption, security concerns abound as
RSA turns 20. NetworkWorld. Retrieved from
https://www.networkworld.com/article/2199500/security/from-cloud-and-mobile-
security-to-encryption--security-concerns-abound-as-rsa-turns-20.html.
Greenberg, A. (2012). This Machine Kills Secrets. New York: Plume.
Gürses, S., Kundnani, A., & Van Hoboken, J. (2016). Crypto and empire: The contradictions of
counter-surveillance advocacy. Media, Culture & Society, 38(4), 576-590.
Hall, S. (1973). Encoding/Decoding. In Durham, M.G. and Kellner, D.M. (Eds.) Media and
Cultural Studies: Keyworks. Malden: Blackwell Publishing.
Hall, S. (1977). Culture, the Media and the ‘Ideological Effect’. In Curran, J. et al (Eds.) Mass
Communication and Society. London: Arnold.
Harris, S. (2014). @War: The rise of the military-Internet complex. Boston: Houghton Mifflin
Harcourt.
Harrison, S. (1998). E-mail discussions as conversation: moves and acts in a sample from a
listserv discussion. Linguistik online, 1(1).
Hellegren, Z.I. (2017). A History of Crypto-Discourse: Encryption as a Site of Struggles to
Define Internet Freedom. Internet Histories, 1(4): 285-311.
Hellman, M. (1975, Oct. 22). Personal communication.
Hellman, M. (1976, Jan. 16). Personal communication.
Hellman, M. (1976, Feb. 23). Personal communication.
Hellman, M. (1977, Oct. 3). Personal communication.
Hellman, M. (2004). Oral history interview with Martin Hellman. Charles Babbage Institute.
Retrieved from https://conservancy.umn.edu/handle/11299/107353.
Hellman, M. (n.d.). Beware the ideas of October… In Autobiography Ch. 1.
Herring, S. C. (1996). Introduction. In Herring, S.C. (Ed.), Computer-Mediated
Communication: Linguistic, Social and Cross-Cultural Perspectives. Amsterdam: John
Benjamins.
Hershkovits, D. (2015, Apr. 22). John Perry Barlow Talks Acid, Cyber-Independence and his
Friendship with JFK Jr. Paper Magazine. Retrieved from
http://www.papermag.com/john-perry-barlow-talks-acid-cyber-independence-and-his-
friendship-wit-1427554020.html.
Hintz, A. and Milan, S. (2010). Social Science is Police Science: Researching Grass-Roots
Activism. International Journal of Communication, 4.
Hoffman, D.N. (1981). Governmental Secrecy and the Founding Fathers. Westport: Greenwood
Press.
Homeland Security Committee. (2016, June). Going Dark, Going Forward: A Primer on the
Encryption Debate. House Homeland Security Committee Majority Staff Report.
Retrieved from https://homeland.house.gov/press/house-homeland-security-committee-
releases-encryption-report-going-dark-going-forward-primer-encryption-debate/.
Huffington Post. (2010). Google CEO On Privacy. Huffington Post. Retrieved from
http://www.huffingtonpost.com/2009/12/07/google-ceo-on-privacy-if_n_383105.html.
Hughes, T. (1987). The Evolution of Large Technological Systems. In Bijker, W.E., Hughes,
T.P., Pinch, T. (Eds.), The Social Construction of Technological Systems: New Directions
in the Sociology and History of Technology. Cambridge: MIT Press.
Hull, D. (1985). Openness and Secrecy in Science: Their Origins and Limitations. Science,
Technology & Human Values, 10(2): 4-13.
Human Rights Council. (2013). Report of the Special Rapporteur on the promotion and
protection of the right to freedom of opinion and expression. United Nations General
Assembly.
Human Rights@Harvard Law (n.d.). In re South African Apartheid Litigation,
HumanRights@Harvard Law. Retrieved from http://hrp.law.harvard.edu/areas-of-
focus/previous-areas-of-focus/in-re-south-african-apartheid-litigation/.
IBM. (1970). Considerations of Data Security in a Computer Environment, IBM.
Jackson, S. (2014). Rethinking Repair. In Gillespie, T., Boczkowski, P.J. and Foot, K.A. (Eds.)
Media Technologies: Essays on Communication, Materiality, and Society. Cambridge:
MIT Press.
Jagodzinski, C. (1999). Privacy and Print: Reading and writing in seventeenth-century England.
Charlottesville: University Press of Virginia.
Joh, E.E. (2016). Beyond Surveillance: Data Control and Body Cameras. Surveillance & Society,
14(1): 133-137.
Jordan, Tim. (2016). A genealogy of hacking. Convergence: The International Journal of
Research into New Media Technologies, 23(5): 528-544.
Juris, J.S. (2008). Networking Futures: The Movements Against Corporate Globalization.
Durham: Duke University Press.
Kahn, D. (1967). The Codebreakers: The story of secret writing. New York: Macmillan.
Kahn, D. (1976, April 3). Tapping Computers. New York Times.
Kang, C. (2016, May 8). Police and Tech Giants Wrangle Over Encryption on Capitol Hill. New
York Times.
Kehl, D., Wilson, A. and Bankston, K. (2015). Doomed to repeat history?: Lessons from the
Crypto Wars of the 1990s. Open Technology Institute. Retrieved from
https://static.newamerica.org/attachments/3407--
125/Lessons%20From%20the%20Crypto%20Wars%20of%20the%201990s.882d6156dc
194187a5fa51b14d55234f.pdf
Kelly, K. (1993). Cypherpunks, e-Money and the Technologies of Disconnection. Whole Earth
Review.
Kelty, C. (2005). Geeks, Social Imaginaries, and Recursive Publics. Cultural Anthropology,
20(2): 185-214.
Kelty, C. (2008). Two Bits: The Cultural Significance of Free Software. Durham: Duke
University Press.
Kolata, G. (1977a, July 29). Computer Encryption and the National Security Agency
Connection. Science, 197: 438-440.
Kolata, G. (1977b, Aug. 19). Cryptography: On the Brink of a Revolution. Science, 197: 747-
748.
Konheim, A.G. (1973, May 30). Personal communication.
Konheim, A.G. (2015). The Impetus to Creativity in Technology. Cryptologia, 39(4): 291-314.
Kline, R. and Pinch, T. (1996). Users as Agents of Technological Change: The Social
Construction of the Automobile in the Rural United States. Technology and Culture, 37:
763-795.
Kreiss, D, and McGregor, S. (2017). Technology Firms Shape Political Communication: The
Work of Microsoft, Facebook, Twitter, and Google With Campaigns During the 2016
U.S. Presidential Cycle. Political Communication, 35(2): 155-177.
Kubitschko, S. (2017). Acting on media technologies and infrastructures: expanding the media as
practice approach. Media, Culture and Society.
Latour, B. (1991). Technology is society made durable. In J. Law (Ed.), A Sociology of
Monsters: Essays on power, technology and domination. London, UK: Routledge.
Latour, B. (2005). Reassembling the Social: An introduction to actor-network-theory. Oxford,
UK: Oxford University Press.
Lauer, J. (2017). Creditworthy: A History of Consumer Surveillance and Financial Identity in
America. New York: Columbia University Press.
Leonardi, P. M. (2012). Materiality, Sociomateriality, and Socio-technical Systems: What do
these terms mean? How are they related? Do we need them? In P. M. Leonardi, B. A.
Nardi & J. Kallinikos (Eds.), Materiality and organizing: Social interaction in a
technological world. Oxford, UK: Oxford University Press.
Levy, S. (1993). Rebels with a Cause (Your Privacy). WIRED. Retrieved from
https://www.wired.com/1993/02/crypto-rebels/.
Levy, S. (1994). Battle of the Clipper Chip. New York Times Magazine. Retrieved from
http://www.nytimes.com/1994/06/12/magazine/battle-of-the-clipper-
chip.html?pagewanted=5.
Levy, S. (1994b). Cypher Wars. WIRED. Retrieved from
https://www.wired.com/1994/11/cypher-wars/.
Levy, S. (2001). Crypto: How the code rebels beat the government – saving privacy in the digital
age. New York: Viking.
Lingel, J. (2017). Networked Field Studies: Comparative Inquiry and Online Communities.
Social Media + Society, 3(4): 1-9.
Linn, J. (1989). RFC 1113-5: Privacy Enhancement for Internet Electronic Mail, Parts I-III,
Internet Engineering Task Force. Retrieved from https://tools.ietf.org/html/rfc1113,
https://tools.ietf.org/html/rfc1114, and https://tools.ietf.org/html/rfc1115.
Ludlow, P., Ed. (2001). Crypto Anarchy, Cyberstates, and Pirate Utopias. Cambridge: MIT
Press.
Mac, R., Warzel, C. and Kantrowitz, A. (2018, Mar. 29). Growth At Any Cost: Top Facebook
Executive Defended Data Collection In 2016 Memo — And Warned That Facebook
Could Get People Killed. BuzzFeed News. Retrieved from
https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-facebook-executive-defended-data?utm_term=.xl1p2Nob2#.ckWAm25rm
Macrakis, K. (2010). Confessing Secrets: Secret Communication and the Origins of Modern
Science. Intelligence and National Security, 25(2): 183-197.
Mahmud, S. (2007, May 18). Technology/Media: Growing internet censorship worldwide –
OpenNet Initiative. Wanabehuman. Retrieved from
https://wanabehuman.blogspot.com/2007/05/technologymedia-growing-internet.html.
Mann, S. and Ferenbok, J. (2013). New Media and the Power Politics of Sousveillance in a
Surveillance-Dominated World. Surveillance & Society, 11(1/2): 18-34.
Marcus, G. (1995). Ethnography in/of the World System: The Emergence of Multi-Sited
Ethnography. Annual Review of Anthropology, 24: 95-117.
Marechal, N. (forthcoming). Use Signal, Use Tor? The Political Economy of Digital Rights
Technology (Unpublished doctoral dissertation). University of Southern California, Los
Angeles, CA.
Markoff, J. (1993). Communications Plan Draws Mixed Reaction. New York Times. Retrieved
from http://www.nytimes.com/1993/04/17/business/communications-plan-draws-mixed-
reaction.html.
Markoff, J. (1994a). Profit and Ego in Data Secrecy. New York Times. Retrieved from
http://www.nytimes.com/1994/06/28/business/profit-and-ego-in-data-
secrecy.html?pagewanted=all.
Markoff, J. (1994b). A Secret Computer Code Is Out. New York Times. Retrieved from
http://www.nytimes.com/1994/09/17/business/company-news-a-secret-computer-code-is-
out.html.
Markoff, J. (1996a). Data-Secrecy Export Case Dropped by U.S. New York Times. Retrieved
from http://www.nytimes.com/1996/01/12/business/data-secrecy-export-case-dropped-
by-us.html.
Markoff, J. (1996b). Judge Rules Against U.S. in Encryption Case. New York Times. Retrieved
from http://www.nytimes.com/1996/12/19/business/judge-rules-against-us-in-encryption-
case.html.
Marvin, C. (1988). When Old Technologies Were New. New York: Oxford University Press.
Marwick, A.E. and boyd, d. (2014). Networked privacy: How teenagers negotiate context in
social media. New Media & Society, 16(7): 1051-1067.
Marx, L. (2010). Technology: The Emergence of a Hazardous Concept. Technology and Culture,
51(3): 561-577.
Mathews, D. (2015, May 20). Defence laws threaten to shut down Aussie encryption. ITNews.
May, T.C. (1992). The Crypto Anarchist Manifesto. Retrieved from
https://groups.csail.mit.edu/mac/classes/6.805/articles/crypto/cypherpunks/may-crypto-
manifesto.html.
McCarthy, D.R. (2010). Open Networks and the Open Door: American Foreign Policy and the
Narration of the Internet. Foreign Policy Analysis, 7(1): 89-111.
McKelvey, F. (2014). Algorithmic Media Need Democratic Methods: Why Publics Matter.
Canadian Journal of Communication. 39(4): 597-614.
Menn, J. (2013, Dec. 20). Exclusive: Secret contract tied NSA and security industry pioneer.
Reuters. Retrieved from https://www.reuters.com/article/us-usa-security-rsa/exclusive-
secret-contract-tied-nsa-and-security-industry-pioneer-idUSBRE9BJ1C220131220
Metz, C. (2016, Apr. 5). Forget Apple vs. FBI: WhatsApp just switched on encryption for a
billion people. WIRED. Retrieved from https://www.wired.com/2016/04/forget-apple-vs-
fbi-whatsapp-just-switched-encryption-billion-people/.
Meyer, D. (2016, Mar. 1). Brazil Arrests Senior Facebook Exec Over WhatsApp Aid in Drug Case.
Fortune. Retrieved from http://fortune.com/2016/03/01/brazil-facebook-arrest/.
Meyer, J.A. (1977, July 7). Personal communication.
Milan, S. (2013). Social Movements and Their Technologies: Wiring social change. New York:
Palgrave Macmillan.
Mollin, R.A. (2006). An Introduction to Cryptography, Second Edition. Boca Raton: CRC Press.
Moody, G. (2015, Jul. 3). New Dutch law would allow bulk surveillance, compelled decryption.
Ars Technica UK. Retrieved from https://arstechnica.com/tech-policy/2015/07/new-
dutch-law-would-allow-bulk-surveillance-compelled-decryption/.
Morris, J.W. (1977, Dec. 6). IBM Announces New Products to Protect Computer Information.
International Business Machines Corporation.
Mosaic. (n.d.). Mosaic Communications.
Moynihan, D.P. and Combest, L. (1997). Secrecy: Report of the commission on protecting and
reducing government secrecy. Darby: Diane Publishing.
Mundy, L. (2017). Code Girls: The Untold Story of the American Women Code Breakers of
World War II. New York: Hachette Books.
Murray, A. (2017, June 25). The story behind the world’s first cashpoint. The Telegraph.
Retrieved from http://www.telegraph.co.uk/personal-banking/current-accounts/story-
behind-worlds-first-cashpoint/.
Myers, A. (2016, Mar. 1). Martin Hellman: Finding the truth is more important than getting your
way. Stanford Magazine. Retrieved from
https://engineering.stanford.edu/magazine/article/martin-hellman-finding-truth-more-
important-getting-your-way.
Nagy, J.A. (2010). Invisible Ink: Spycraft of the American Revolution. Yardley: Westholme.
Narayanan, A. (2013). What Happened to the Crypto Dream?, Part 1. IEEE Security & Privacy,
11(2): 75-76.
Narayanan, A. and Clark, J. (2017, Aug. 29). Bitcoin’s Academic Pedigree. ACM Queue, 15(4):
1-18.
National Bureau of Standards. (1974, Apr.). The NBS Encryption Program: Status Report: April
1974. National Bureau of Standards.
National Security Agency. (1972, Sept. 18). Memorandum of Understanding Between the
National Security Agency and the National Bureau of Standards Relative to Computer
Security Techniques for the Protection of Data and Control of Data Access. National
Security Agency.
Netscape. (1995). Internet Based Commercial Applications. Netscape Communications
Corporation.
Nissenbaum, H. (2010). Privacy in Context: Technology, Policy and the Integrity of Social Life.
Stanford: Stanford Law Books.
Noble, S. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York:
NYU Press.
Nordin, A. & Richaud, L. (2014). Subverting official language and discourse in China?: Type
river crab for harmony. China Information. 28(1): 47-67.
Oldenziel, R. (1999). Making Technology Masculine: Men, Women, and Modern Machines in
America, 1870-1945. Amsterdam: Amsterdam University Press.
Oxford. (2017). Encryption. Oxford Dictionaries. Retrieved from
https://en.oxforddictionaries.com/definition/encryption.
Pasquale, F. (2014). The Black Box Society. Cambridge: Harvard University Press.
Perlroth, N., Larson, J. and Shane, S. (2013, Sept. 5). NSA Able to Foil Basic Safeguards of
Privacy on Web. New York Times. Retrieved from
http://www.nytimes.com/2013/09/06/us/nsa-foils-much-internet-encryption.html.
Pinch, T.J. and Bijker, W.E. (1987). The Social Construction of Facts and Artifacts: Or How the
Sociology of Science and the Sociology of Technology Might Benefit Each Other, in
Bijker, W.E., Hughes, T.P., Pinch, T. (Eds.), The Social Construction of Technological
Systems: New Directions in the Sociology and History of Technology. Cambridge: MIT
Press.
Portwood-Stacer, L. (2013). Lifestyle Politics and Radical Activism. London: Bloomsbury
Publishing.
Potter, L. (1989). Secret Rites and Secret Writing: Royalist literature, 1641-1660. Cambridge:
Cambridge University Press.
Powers, S. and Jablonski, M. (2015). The Real Cyber War: The political economy of internet
freedom. Urbana: University of Illinois Press.
Price, R. (2015, July 1). David Cameron is going to try and ban encryption in Britain. Business
Insider. Retrieved from http://www.businessinsider.com/david-cameron-encryption-back-
doors-iphone-whatsapp-2015-7.
Protect Our Spaces. (2017). Protect Our Spaces. Retrieved from https://protectourspaces.org/.
Public Cryptography Study Group. (1981). Report of the Public Cryptography Study Group.
Cryptologia, 5(3), 130–142. Retrieved from https://doi.org/10.1080/0161-118191855931.
Reeds, J. (1998). Solved: The Ciphers in Book III of Trithemius’s Steganographia. Cryptologia,
22(4): 291-317.
Rid, T. (2016). Rise of the Machines: A cybernetic history. New York: W.W. Norton and
Company.
Rider, K. (2017). The privacy paradox: how market privacy facilitates government surveillance.
Information, Communication & Society.
Rivest, R.L. (2016). Oral History of Ronald L. Rivest. Computer History Museum. Retrieved
from http://archive.computerhistory.org/resources/access/text/2017/07/102717255-05-01-
acc.pdf.
Rivest, R.L., Shamir, A., Adleman, L. (1978). A Method For Obtaining Digital Signatures and
Public-Key Cryptosystems. Communications of the ACM, 21(2): 120-126.
Roberts, A. (2006). Blacked Out: Government secrecy in the information age. Cambridge:
Cambridge University Press.
Rodger, W. (2001, Nov. 30). Cypherpunks RIP. The Register. Retrieved from
https://www.theregister.co.uk/2001/11/30/cypherpunks_rip/.
Rogaway, P. (2015). The Moral Character of Cryptographic Work. IACR Cryptology ePrint
Archive. Retrieved from http://web.cs.ucdavis.edu/~rogaway/papers/moral.html.
Rosenberg, A. (2003). Cryptologists: Life making and breaking codes. New York: Rosen
Publishing Group.
Rosenheim, S.J. (1997). The Cryptographic Imagination: Secret Writing from Edgar Poe to the
Internet. Baltimore: The Johns Hopkins University Press.
Rosenthal, I. (1993). CFP’93 – Export Controls on Mass Market Software with Encryption
Capabilities. Computer Professionals for Social Responsibility. Retrieved from
http://cpsr.org/prevsite/conferences/cfp93/rosenthal.html/.
RSA Data Security. (1994, Mar. 21). RSAREF™: A Cryptographic Toolkit. RSA Laboratories.
Retrieved from
http://www.mit.edu/afs.new/sipb/user/yoav/athena/work/lcs/src/rsaref/doc/info.txt.
RSA Data Security. (2002). RSA Data Security. In J. A. Malonis (Ed.), Gale Encyclopedia of E-
Commerce (2): 634-636. Detroit: Gale.
Rubinstein, I.S. and Hintze, M. (2000). Export Controls on Encryption Software. Coping with
U.S. Export Controls 2000. Retrieved from
http://encryption_policies.tripod.com/us/rubinstein_1200_software.htm.
Schneier, B. (1996). Applied Cryptography (20th anniversary ed.). Hoboken: John Wiley & Sons.
Schneier, B. (2015). Data and Goliath: The hidden battles to collect your data and control your
world. New York: W.W. Norton and Co.
Schulte, S. R. (2013). Cached: Decoding the internet in global popular culture. New York: New
York University Press.
Schwarz Jr., F. (2015). Democracy in the Dark: The seduction of government secrecy. New
York: The New Press.
Schwartz, J. (2001, Sept. 25). Disputes on Electronic Message Encryption Take On New
Urgency. New York Times. Retrieved from
https://www.nytimes.com/2001/09/25/business/disputes-on-electronic-message-
encryption-take-on-new-urgency.html.
Scott, J.C. (1990). Domination and the Arts of Resistance: Hidden transcripts. New Haven: Yale
University Press.
Scott, J.W. (1988). Gender and the Politics of History. New York: Columbia University Press.
Shaffer, R.A. (1978, June 16). Cryptic Reaction: Companies Use Codes To Ward Off Thieves
and Safeguard Secrets. The Wall Street Journal.
Shapley, D. (1977, Sept. 9). Telecommunications Eavesdropping by NSA on Private Messages
Alleged. Science, 197: 1061-1062.
Shapin, S. and Schaffer, S. (1985). Leviathan and the Air-Pump: Hobbes, Boyle, and the
experimental life. Princeton: Princeton University Press.
Simmel, G. (1906). The Sociology of Secrecy and of Secret Societies. American Journal of
Sociology, 11(4): 441-498.
Singh, S. (1999). The Code Book: The evolution of secrecy from Mary Queen of Scots to
quantum cryptography (1st ed.). New York: Doubleday.
Sixsmith, J., & Murray, C. D. (2001). Ethical issues in the documentary data analysis of Internet
posts and archives. Qualitative Health Research, 11(3), 423-432.
Solove, D.J. (2007). "I've got nothing to hide" and other misunderstandings of privacy. San
Diego Law Review, 44(4), 745-772.
Squires, C. (2002). Rethinking the Black Public Sphere: An Alternative Vocabulary for Multiple
Public Spheres. Communication Theory, 12(4): 446-468.
Star, S. L., & Ruhleder, K. (1996). Steps toward an ecology of infrastructure: Design and access
for large information spaces. Information Systems Research, 7(1), 111-134.
Star, S.L. (1999). The Ethnography of Infrastructure. The American Behavioral Scientist, 43(3):
377.
Streeter, T. (2011). The Net Effect: Romanticism, capitalism, and the internet. New York: New
York University Press.
Sussman, V. (1995, Mar. 28). Lost in Kafka Territory. US News and World Report.
Swartz, L. (2018). What was Bitcoin, what will it be? The techno-economic imaginaries of a new
money technology. Cultural Studies.
Taylor, C. (2002). Modern Social Imaginaries. Public Culture, 14(1): 91-124.
Taylor, C. (2004). Modern Social Imaginaries. Durham: Duke University Press.
TIME. (1978, July 3). An Uncrackable Code? TIME.
Tufekci, Z. (2015). Algorithmic Harms beyond Facebook and Google: Emergent Challenges of
Computational Agency. Colorado Technology Law Journal, 13(2): 203-216.
Turner, F. (2006). From Counterculture to Cyberculture: Stewart Brand, the Whole Earth
Network, and the rise of digital utopianism. Chicago: University of Chicago Press.
United Nations. (2015). Report on encryption, anonymity, and the human rights framework.
United Nations Human Rights Office of the High Commissioner. Retrieved from
http://www.ohchr.org/EN/Issues/FreedomOpinion/Pages/CallForSubmission.aspx.
Van der Leun, G., Godwin, M., Kapor, M. (1991). EFFector Online, 1(6). Retrieved from
https://www.eff.org/effector/1/6.
Van Dijck, J. (2013). Facebook and the engineering of connectivity: A multi-layered approach to
social media platforms. Convergence. 19(2): 141-155.
Vinge, V. (2001). True Names and the Opening of the Cyberspace Frontier. New York: Tor
Books.
Volkman, E. (1977, Oct. 26). Spying Motive Seen in U.S. Rule on Computer Security. Los
Angeles Times, 8.
Warner, M. (2002). Publics and Counterpublics. Cambridge: MIT Press.
Weaver, W. (1949, July). The Mathematics of Communication. Scientific American, 181(1): 11-
16.
Weber, M. (1946). Bureaucracy, In Gerth, H.H. and Mills, C.W. (Eds.) From Max Weber:
Essays in Sociology. New York: Oxford University Press.
West, S.M. (2017). Data Capitalism: Redefining the Logics of Surveillance and Privacy.
Business & Society.
Wikileaks. (2007). Wikileaks.org. Retrieved from
https://web.archive.org/web/20070202025339/http://wikileaks.org:80/.
Williams, J. (2001). The Invisible Cryptologists: African Americans, WWII to 1956. Center for
Cryptologic History, National Security Agency. Retrieved from
https://www.nsa.gov/about/cryptologic-heritage/historical-figures-publications/african-
americans/.
Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1): 121-136.
Wired. (1994). Anonymously Yours – An Interview with Johan Helsingius. WIRED.
Wolfson, T. (2014). Digital Rebellion: The birth of the cyber left. Urbana: University of Illinois
Press.
Wyatt, S. (2008). Technological Determinism is Dead; Long Live Technological Determinism,
in Hackett, E.J., Amsterdamska, O., Lynch, M. and Wacjman, J. (Eds.), The Handbook of
Science and Technology Studies. Cambridge: MIT Press.
Yasaki, E.K. (1976, Mar.). Encryption Algorithm: Key Size is the Thing. Datamation. Retrieved
from http://onlinebooks.library.upenn.edu/webbin/book/lookupid?key=olbp37801.
Yost, J.R. (2004, Nov. 22). An Interview with Martin Hellman. Charles Babbage Institute,
Center for the History of Information Technology. Retrieved Sept. 24, 2017 from
https://conservancy.umn.edu/handle/11299/107353.
Yost, J.R. (2015). The Origin and Early History of the Computer Security Software Products
Industry. IEEE Annals of the History of Computing, 37(2): 46-58.
Zetter, K. (2013, Sept. 19). RSA tells its developer customers: Stop using NSA-linked algorithm.
WIRED. Retrieved from https://www.wired.com/2013/09/rsa-advisory-nsa-algorithm/.
Zimmermann, P. (1991). Why I Wrote PGP. PGP User’s Guide. Retrieved from
https://www.philzimmermann.com/EN/essays/WhyIWrotePGP.html.
Zimmermann, P. (1995). Author’s preface. In PGP Source Code and Internals. Cambridge: MIT
Press.
Zimmermann, P. (2001, Sept. 24). Philip Zimmermann and ‘Guilt’ Over PGP. Slashdot. Retrieved
from https://interviews.slashdot.org/story/01/09/24/162236/philip-zimmermann-and-guilt-over-pgp.
Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information
civilization. Journal of Information Technology, 30(1): 75-89.
Zuckerberg, M. (2018, Apr. 11). Testimony, Hearing Before the United States House of
Representatives Committee on Energy and Commerce. Retrieved from
http://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-
ZuckerbergM-20180411.pdf.