Sacrificing Cost, Performance and Usability for Privacy: Understanding the Value of Privacy in a
Multi-Criteria Decision Problem
by
Kenneth D. Nguyen
A Thesis Presented to the
FACULTY OF THE GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
MASTER OF ARTS
(PSYCHOLOGY)
August 2015
Author Note
This research was supported by the U.S. Department of Homeland Security through the
National Center for Risk and Economic Analysis of Terrorism Events (CREATE) under award number
2010-ST-061-RE0001. However, any opinions, findings, conclusions, or recommendations in this
document are those of the author and do not necessarily reflect the views of the U.S. Department of
Homeland Security, the University of Southern California, or CREATE.
Correspondence concerning this article should be addressed to Kenneth Nguyen,
Department of Psychology, University of Southern California, 3715 McClintock Avenue, Los
Angeles, CA, 90089-0191. Email: hoangdun@usc.edu
Acknowledgement
I would like to thank my advisor, Dr. Richard John, for his guidance and encouragement
throughout this scholarly work. I also wish to thank Dr. John J. McArdle and Dr. Morteza Dehghani
for their thoughtful feedback on my work.
I am especially grateful to my aunt Stephanie Dao and my uncle Danny Nguyen for their
tremendous support throughout my undergraduate education. Finally, there are no words that can
describe my gratitude to my mother, who always tries her best to give me the best life possible, and to
my girlfriend (soon to be my wife) Thanh Nguyen for being with me through difficult times.
Table of Contents
Chapter 1. Introduction
    Estimating the value of privacy protection
    The effects of different privacy threats
    The effects of individual characteristics
Chapter 2. Modeling Approach
    Multi-attribute utility theory
    Cognitive biases in trade-off elicitation
    The "indifference procedure"
    Consistency check
Chapter 3. Method
    Experimental procedure
    Attribute definitions
    Elicitation procedure
    Privacy threat manipulation
    Choice Presentation
Chapter 4. Results
    The Value of a Secured Smartphone
    The effect of different types of privacy
    The effect of individual characteristics
    Consistency check results
Chapter 5. Discussion
References
List of Tables
Table 1. Different types of privacy threat
Table 2. Objectives and attributes presented to subjects
Table 3. Median trade-off values for an encrypted smartphone
Table 4. Binary logistic regression results
Table 5. Medians of correlations between attribute set judgments
List of Figures
Figure 1. Hypothetical trade-off between privacy and cost
Figure 2. A screenshot of our choice presentation
Figure 3. Cumulative distributions of trade-off values for an encrypted smartphone
Abstract
Increasing interest in technologies that enhance online information privacy makes
understanding how people value their privacy a research priority. The current research aims to quantify
the value of privacy protection in the context of smartphone technology by estimating the
trade-offs that people are willing to make for privacy protection. We also investigated how the
value of privacy protection varied by contextual variables and individual characteristics. To do
so, we adapted a paradigm proposed by Tversky, Sattath, and Slovic (1988) to examine the
value of privacy protection by estimating trade-offs for an encrypted smartphone. The results
show that respondents were willing to pay non-trivial premiums for smartphone privacy
protection. Interestingly, ambiguously describing a privacy threat increased the value of
privacy protection compared to depicting the threat with more contextual detail. We also
observed the effects of general privacy concern, age, and self-reported political attitude on the
value of privacy protection. These results demonstrate the importance of accounting for user
values in the design of a usable security system.
Keywords: privacy premium, trade-off preference, construal levels theory
Chapter 1. Introduction
The growing public concern over cyber security entails an urgent need to protect the
safety and security of personal information. The widespread application of technology
undoubtedly opens new opportunities for people to work, socialize, and be entertained. However,
the omnipresence of technology also exposes people to a wide array of privacy risks, resulting in
strong public sentiment against privacy violations (Culnan & Milne, 2001; Gandy, 2003; Gross &
Acquisti, 2005; Utz & Kramer, 2009). Such growing public concern over information privacy
motivates the need for effective solutions to secure personal information in cyberspace.
The current privacy literature identifies three major approaches to increasing
information privacy: legislation, industry self-regulation, and individual self-protection. The
focus of the current research is on the last alternative, primarily because individual
self-protection is the most relevant and perhaps most effective means of securing personal
information. This is because the continuing emergence of new technologies
ensures that users will encounter new types of privacy risks. As a result, users have to acquire
relevant security knowledge and/or adopt appropriate technologies to protect themselves from
these emerging threats. Echoing this point, Kruse encourages people to acquire the skills
necessary to manage their technological environment in a self-determined way (cited in Moser,
Bruppacher & Mosler, 2011).
Past research identifies several precautionary actions that users can take to protect
themselves against information privacy risks (Lwin, Wirtz, & Williams, 2007). The focus of
the current research is on the use of technologies as a tool to enhance information privacy.
Although the advent of new technologies is often believed to create new types of privacy risks,
privacy protection can also be implemented through technological innovations (Smith, Dinev,
& Xu, 2011). Indeed, new privacy-enhancing technologies can help individuals to protect their
privacy by gaining more control over the flow of their data (Burkert, 1997). Critically, the
success and acceptance of a new privacy-enhancing technology largely depend on the extent
to which people perceive the technology as valuable for protecting their private information. A
privacy-enhancing technology may not be used when end-users perceive it as ineffective at
protecting the confidentiality of their information, think it is too hard to use, or find it too
demanding to change their behaviors in accordance with the technology's requirements.
Thus, the primary focus of this study is to understand how users perceive the value of a
technology that is designed for privacy protection by examining how much usability,
performance, and money users are willing to sacrifice for privacy protection. Stated differently,
we aim to quantify the “privacy premium” that people are willing to incur for information
privacy. Importantly, since the value of privacy is related to the context in which concern for
privacy arises, we also attempt to examine how different contextual variables and individual
characteristics can influence the privacy premium.
Estimating the value of privacy protection
The primary research focus of the current study is to explore how people value their
information privacy by estimating their trade-offs for privacy protection. Since previous
research has shown that people often have multiple and conflicting priorities when they
evaluate a decision related to their information privacy and security (Hann, Hui, Lee, & Png,
2007; Workman, Bommer, & Straub, 2008; Tsai et al., 2011), we attempt to model how people
make trade-offs for privacy when their preference for high privacy conflicts with other
desirable objectives. We chose the trade-off methodology because this approach requires
decision makers to think hard about their personal values in order to make trade-offs for
privacy. Indeed, scholars have argued that we can gain a better understanding of what people
want and value by examining how they make difficult choices (Keeney & Raiffa, 1976). In the
context of information privacy, by examining how people make trade-offs for privacy
protection, we can better understand and begin exploring the reasons why people engage in or
forgo privacy-protective behaviors.
The current literature documents several attempts to quantify the value of information
privacy using a trade-off assessment methodology. In one of the earliest studies of this kind,
Milne and Gordon (1993) used conjoint analyses to study trade-offs in the context of direct
mail marketing. They found that consumers wanted to improve targeting efficiency and lower
mail volume, but they were not willing to pay for these improvements. In a more recent study,
Hann and his colleagues (2007) attempted to quantify the value that individuals ascribe to
website privacy protection. They found that U.S. subjects were willing to pay between $30.49
and $44.62 for protection against errors, improper access, and secondary use of personal
information. In addition, the authors found that convenience was a minor factor when
respondents evaluated website privacy policies. These studies, however, do not address the
current research question on how people value a technology that is designed for information
security purposes.
In a different line of research, the trade-off paradigm is used to quantify the economic
values of personal information. Previous research has shown that people were willing to trade
off their personal data for financial gains (Acquisti & Grossklags, 2005a; Hann et al., 2007),
for personalization (Chellappa & Sin, 2005), and for convenience and discounts
(Spiekermann, Grossklags, & Berend, 2001). In addition, a 2002 Jupiter Research study found
that 82% of online shoppers were willing to give personal information to new shopping sites in
exchange for a chance to win $100, and 36% said they would allow companies to track their
web browsing behaviors for a $5 discount (Tedeschi, 2002). In a different study, Huberman et
al. (2007), using a second-price auction approach, found that individuals wanted more money
to reveal information that was “abnormal” or “undesirable”.
The studies discussed above focus on assessing how much people are willing to
accept (WTA) to disclose their information. The focus of the current study, nonetheless, is on
assessing how much people are willing to pay (WTP) or sacrifice for a technology to prevent
privacy violations. Research in psychology and economics attests that the two judgments,
WTA and WTP, are quite different (see Hanemann, 1991 for a review). In fact, Grossklags and
Acquisti (2007) have documented discrepancies between WTA and WTP judgments in the
context of information privacy. Consequently, drawing inferences about WTP values from
WTA values is not warranted.
In short, there is a dearth of research on assessing the value of information privacy
protection. Our purpose in the current study is to fill this research gap by examining the
“premium” that people are willing to sacrifice for privacy. Importantly, unlike previous
research, we take into account the fact that the preference for high privacy is often in conflict
with other objectives that people highly value. Therefore, the current study attempts to model
how people make trade-offs for privacy in a multi-attribute context where a privacy-related
decision problem involves multiple and conflicting objectives.
We chose to examine the value of privacy protection in the context of smartphone
technology; specifically the decision context of an individual choosing a smartphone to
purchase. This decision context was carefully chosen for a number of reasons. First and
foremost, people are increasingly worried about the security of data stored in their mobile
devices (Egelman, Felt, & Wagner, 2012; Chin, Felt, Sekar, & Wagner, 2012). Second, modern
smartphones possess similar capacities to personal computers and enable users to perform a
number of tasks, which also exposes the users to a wide array of privacy risks. Third, the
choice of smartphone technology naturally sets up the multi-attribute decision problem:
people have to make a number of value trade-offs to protect their (smartphone) privacy.
Within the hypothetical decision scenario, the encrypted smartphone helps the user
attenuate privacy risks, but it is more expensive and more difficult to use. The enhanced
encryption technology also impedes the overall performance of the device, such as limiting the
users’ access to a number of mobile applications and lengthening the processing time required
for transmission of data. The alternative smartphone is not encrypted, and it exposes users to a
higher level of privacy risks. Nonetheless, it is cheaper, more user-friendly, and does not affect
the overall performance of the device. Using this decision context, we assessed how people
make trade-offs between privacy and the following attributes: cost, usability, and performance.
We refer to the trade-offs that users are willing to make for privacy protection, e.g. paying
more for the encrypted smartphone, as privacy premiums.¹
¹ The following terms are used interchangeably: the value of privacy (protection), trade-offs for privacy, and privacy premiums.
The effects of different privacy threats
The second aim of the current research is to understand the effect of different types of
privacy threat on the value of privacy protection. It has been suggested that the value of information
privacy should only be discussed once its context has also been specified (Acquisti, 2004). In
particular, scholars have argued that (concern for) privacy either depends on or derives from the
nature of its threats (Regan, 1995; Sheehan, 2002). Following this line of reasoning, we consider
how the value of privacy protection is conditional upon different types of privacy threat—a
contextual factor that has not been discussed in prior research.
We consider the type of privacy threat as the possible consequences that can befall the
victims of a data breach when their data are collected by different attackers. In reality, the type of
privacy threat, as defined in this study, is often ambiguous and can have different behavioral
implications. People often have an obscure understanding of the nature and scope of a cyber threat.
There are several reasons for this. Security experts need time to verify a cyber threat before
they can confidently communicate the incident to the public. As a result, consumers and users are
often the last to know about a cyber threat that has been going on for a while, which can make it
difficult for them to understand the incident. For instance, it took Target several weeks to notify its
customers about its data breach, although the company had received early warnings from federal
investigators about the potential breach.
Even more troublesome is the fact that reports on data breaches often emerge in the
mainstream media prior to the formal announcement of the incident. This creates an additional
layer of complexity and confusion, as people may receive conflicting, and perhaps inaccurate,
information about the nature and scope of the privacy incident. Finally, the complex and technical
nature of any cyber threat investigation may also prevent people from fully understanding the nature
of the threat. For these (and other) reasons, most people may find it hard to understand who
collects their data and for what purpose, what may happen to their privacy, and what they can do to
prevent future threats.
Interestingly, the ambiguity of the context in which a cyber threat occurs can prompt people
to engage in drastically different behaviors. Some people may be more likely to take cautious
measures to protect their privacy given the uncertainty of the situation, while others may simply do
the opposite because they perceive the incident as a false alarm. Despite this important behavioral
implication, little is known about how people value their information privacy in contexts where
the privacy threat is ambiguous or unspecified.
In Table 1, we contrast different types of privacy threat. Across several dimensions, a privacy
threat from an unknown source appears to be more ambiguous. For example, there is little
information on the motivation of the attackers and the likelihood that the (ambiguous) attack will
continue. This difference, we argue, can prompt people to behave differently when the cyber threat
is vaguely defined compared to when it is described in much more specific detail.
In fact, the Construal Levels Theory (CLT) (see Liberman, Trope & Wakslak, 2007 for a
review) predicts that the ambiguity of the situation in which a cyber threat arises can lead people to
highly value their privacy. The theory postulates that different perceptions of psychological distance
can influence individuals' thoughts and behavior. People, the theory posits, engage in concrete
conceptualization of objects when they perceive those objects as psychologically near, while their
conceptualization of the same objects is more abstract when the objects are perceived as
psychologically remote. Interestingly, an abstract conceptualization of an object makes it appear
more valuable or more important than a concrete conceptualization of the same object.
In the current research context, an unspecified or unknown type of privacy threat can be
perceived as more abstract or distal than a privacy threat that is specified or known. The term
"privacy" is notoriously difficult to define, and it has multiple meanings. People may construe
"privacy" as a concrete action of collecting unauthorized information, or alternatively, they may
think of it as an abstract fundamental right protected by the Constitution or as a universally desirable
or even moral value, at least in the West. Thus, when the context of a privacy violation is ambiguous,
people are likely to construe the meaning of privacy in an abstract manner.
On the other hand, when the context of a privacy violation is specific, people can construe the
violation and the meaning of privacy more concretely, since the contextual information on the
incident is accessible. People can understand why their privacy was violated, who the attacker is, and
what they can do to protect their privacy against future risks. As a result, and according to CLT, the
value of privacy protection should be higher when the type of privacy threat is unspecified than
when it is specified. Support for this prediction comes from studies on the effect of abstract and
concrete construal of moral values on behavioral intentions (Eyal, Liberman, & Trope, 2008; Eyal
et al., 2009; Amit & Greene, 2012). For example, Eyal and her colleagues (2009) found that their
subjects were more likely to choose actions that align with their stated values when an abstract
construal was evoked, but chose actions that diverge from their values when a concrete
construal was evoked. However, there has been no attempt to apply CLT to explain the effect of
the ambiguity or specificity of a privacy threat on how people value the protection of their personal
information. Thus, we attempt to test this prediction in our study using an experimental procedure.
In addition, Table 1 also contrasts the different types of privacy threat when the nature of the
threat is known. These specified threats, including marketing, government intrusion, hacking, and
snooping, represent a wide range of privacy violations that people may encounter
in their daily lives. Snoopers, for example, generally collect their victims' information to serve
interpersonal purposes such as personal revenge, while hackers almost exclusively focus on financial
gains. Marketers attempt to collect users' data for marketing services, while the government is
generally believed to use the collected information for national security purposes.
The differences between these four specified types of privacy threat, as depicted in Table 1,
suggest that people may value their information privacy differently, depending on the specific
context in which their privacy is compromised. This suggestion raises an interesting generalizability
hypothesis. On the one hand, if the value of privacy protection is found to be insensitive to different
privacy threat contexts, we can be confident that the value of privacy protection is generalizable. On
the other hand, if the value of privacy protection is contingent upon the privacy threat contexts, we
can begin learning how different contextual variables can influence the perception of privacy value.
Thus, in addition to comparing the theoretical difference in the value of privacy protection between
the specified and unspecified types of privacy threat, we also explore possible differences in the
value of privacy protection across the specific types of privacy intrusion to test the generalizability
hypothesis.
The effects of individual characteristics
The third aim of the current research is to understand the extent and nature of individual
differences in the privacy premium. The pioneering privacy researcher Westin (1991) proposed a
classification that categorizes people with privacy concerns into three groups: privacy
fundamentalists, privacy pragmatists, and privacy marginals. Those marginally concerned about
privacy are mostly insensitive to privacy protection; privacy fundamentalists, in contrast, view
privacy as a "protected" or "sacred" value (Baron & Spranca, 1997; Tetlock, 2003) and are unwilling
to compromise their private information. Between these two polarized groups are those who hold a
pragmatic view of privacy. These people are concerned about privacy, but they are willing to
trade personal data for benefits. This classification clearly implies that preference for privacy
protection is heterogeneous. Individuals with distinct concerns and preferences
value privacy protection differently. For example, people with higher general concern for
information privacy should value a technology that protects their personal information more
than people with lower concern for privacy.
Equally interesting are the potential relationships between demographic variables such
as sex, age, and political attitude and the value of privacy protection. While there is some
research on sex differences for privacy concerns (Pedersen, 1987; Bartel, 1999; Garbarino &
Strahilevitz, 2004), few researchers have examined the effects of age and political attitudes on
the value of information privacy. Therefore, our last research purpose is to explore how
demographic variables such as sex, age, and political attitude, as well as general privacy
concerns, predict the privacy premium that people are willing to pay to increase information
privacy.
Chapter 2. Modeling Approach
Multi-attribute utility theory
Multiattribute utility theory provides a proper framework for studying how people make
trade-offs for privacy. The backbone of the theory is to establish the exchange rates between
conflicting objectives. The term "objectives" reflects the decision maker's aspirations or values that
he wants to achieve. Each of these objectives can be described by an attribute—a quantitative
indicator that measures how well an alternative helps the decision maker achieve his aspiration.
Since the decision maker often has multiple objectives, the respective attributes are combined in
a multiattribute value function to create a score that reflects the extent to which an alternative
meets the decision maker's expectations.
The method for combining the attributes depends on how researchers presume the underlying
relationships between the attributes in the multiattribute value model. The additive assumption,
which requires that the decision maker's preference for an attribute is independent of his preferences
for other attributes at different levels, is often used. Despite its simplicity, empirical studies suggest
that the additive form provides an excellent approximation to people's value assessments (von
Winterfeldt & Edwards, 1986). The additive form is mathematically expressed as:
V(x_1, x_2, \ldots, x_n) = \sum_{i=1}^{n} w_i \, v_i(x_i) \qquad (1)
where X_i represents the i-th attribute, lower case x_i indicates the value of that attribute in the
consequence space, v_i is the corresponding single-attribute value function, and w_i is the scaling
constant.
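To make the additive form concrete, the following minimal sketch (not taken from the thesis; the attribute names, value functions, ranges, and weights are illustrative assumptions) scores a hypothetical smartphone alternative using Equation 1.

```python
# Minimal illustration of Equation 1 (illustrative values, not the thesis's data).
# Single-attribute value functions v_i map raw attribute levels onto a 0-1 scale;
# the scaling constants w_i sum to 1.
value_functions = {
    "privacy":   lambda encrypted: 1.0 if encrypted else 0.0,
    "cost":      lambda dollars: 1 - (dollars - 300) / (800 - 300),   # $300 best, $800 worst
    "usability": lambda learn_min: 1 - learn_min / 480,               # 0-480 min to learn
    "speed":     lambda delay_sec: 1 - delay_sec / 16,                # 0-16 s photo delay
}
weights = {"privacy": 0.35, "cost": 0.30, "usability": 0.20, "speed": 0.15}

def additive_value(alternative):
    """V(x_1, ..., x_n) = sum_i w_i * v_i(x_i)."""
    return sum(weights[a] * value_functions[a](x) for a, x in alternative.items())

encrypted_phone = {"privacy": True, "cost": 650, "usability": 120, "speed": 7}
print(round(additive_value(encrypted_phone), 3))
```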
Of particular interest is the scaling constant w. In most real-world situations,
objectives are in conflict with one another, i.e. the more secure a smartphone is, the more
expensive it is. The scaling parameter reflects these conflicts by normalizing the exchange rates
between attributes—the extent to which a unit increase in the value of one attribute is equivalent to a
decrease in the value of another attribute.² For example, in the current study, one of the exchange
rates is the extent to which a decision maker is willing to pay more for a secured smartphone. The
exchange rates are usually normalized between 0 and 1.
² Assuming the value functions for both attributes are monotonically increasing.
One of the main focuses of the current research is to obtain the exchange rates between
privacy and other relevant attributes. Learning the exchange rates for privacy is important because
they indicate how a decision maker values privacy in relation to other attributes. For this reason,
understanding the method used to quantify the exchange rates between privacy and other relevant
attributes is critical. However, this cannot be accomplished without first discussing some of the
relevant cognitive biases that may occur during the elicitation procedure.
Cognitive biases in trade-off elicitation
The normative decision-making model presumes that a preference should be insensitive to
the method by which it is elicited, a normative requirement known as procedural invariance.
However, both empirical research and experience suggest that different trade-off elicitation
procedures yield different results. Hence, a sensible methodological approach is to select a procedure
that minimizes some of the cognitive biases that have been found to distort the decision maker's
judgment, such as the range insensitivity effect and the prominence effect.
The range insensitivity effect refers to the empirical observation that decision makers are not
sensitive enough to the range of an attribute when making trade-off judgments. The following
example illustrates how this effect unfolds. In Scenario 1, a decision maker considers purchasing a
smartphone, and his main concerns are cost and privacy. After doing some research, he learns that an
encrypted smartphone, which helps to keep his data secured, costs about $600, while the “regular”
smartphone costs about $500. In Scenario 2, the costs of the secured and the unsecured phones are
$800 and $300, respectively.
Comparing the two scenarios, it seems likely that the difference in cost in the second scenario
will have a greater impact on the decision maker’s deliberation than the difference in the first
scenario. Indeed, the decision maker should be sensitive to the range of the cost attribute, increasing
its “weight” when the range is large and decreasing it when the range is small, a normative principle
known as range sensitivity. However, empirical research shows that few people conform to this
normative requirement (von Nitzsch & Weber, 1993).
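The following small sketch (an illustration under an added assumption, not an example from the thesis) shows why the normalized weight on cost should grow with its range: if the decision maker values a full swing on the privacy attribute at a fixed dollar amount, the relative weight of cost necessarily increases as the cost range widens.

```python
def normalized_weights(privacy_worth_dollars, cost_range_dollars):
    """Swing-weight illustration: privacy's full swing is worth a fixed dollar
    amount, so its weight relative to cost shrinks as the cost range widens."""
    w_privacy_rel = privacy_worth_dollars / cost_range_dollars  # relative to w_cost = 1
    total = 1.0 + w_privacy_rel
    return {"cost": 1.0 / total, "privacy": w_privacy_rel / total}

print(normalized_weights(100, 100))   # Scenario 1: $500-$600 range -> equal weights
print(normalized_weights(100, 500))   # Scenario 2: $300-$800 range -> cost weight grows
```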
Another type of cognitive bias that should warrant attention from researchers is the
prominence effect. The effect occurs when a decision maker favors an alternative that is high on the
most important attribute and underestimates weights for other attributes. For instance, many people
are mainly concerned about minimizing the cost of a smartphone although they do have preferences
for other objectives such as maximizing privacy. Without a proper systematic way of thinking, it is
likely that these decision makers assign an undue "weight" to the cost objective and neglect the
relative "importance" of privacy. Studies have shown that decision makers are inclined to employ a
qualitative lexicographic ordering approach to select an alternative, overweighting the relative
importance of the attribute considered to be the most important (Tversky, Sattath, & Slovic,
1988). This is certainly undesirable because the trade-off method requires people to think
quantitatively, i.e. how much of an increase in X is worth a unit decrease in Y.
The “indifference procedure”
Tversky, Sattath, and Slovic (1988), in a now classic paper, proposed a method called
"matching in a context of choice experiment". The following example, adapted from the original
paper, illustrates this method. Participants in a study were asked to select one of the two programs.
                 Number of casualties      Cost
Program X        500                       $55 million
Program Y        570                       $12 million
The participants were asked to choose between two options. However, prior to the decision, they
were asked whether the cost of program X, at $55 million, is too high or too low. Then they were
asked for the value that they thought appropriate. The advantage of this procedure is that it
decomposes the complex trade-off judgment into a simpler task of choosing between two options,
e.g. program X or program Y, while encouraging the decision maker to consider the options more
quantitatively. The authors argued, and showed evidence, that matching in a choice context helps
to mitigate the prominence effect.
In the current study, we expanded the method used in Tversky et al. (1988). The elicitation
used in the current experiment requires a decision maker to make a series of binary choices in which
two alternatives are compared along two attributes. Alternative A appears more attractive than
alternative B on the first attribute, e.g. A is cheaper than B, but the opposite is true on the second
attribute, e.g. B is high in privacy while A is low. The decision maker is asked to indicate which
option is preferable.
The key difference between Tversky's procedure and the current method is the use of an
iterative procedure to locate the "indifference point"—the point at which a decision maker perceives
the two alternatives as equally attractive. This key change in the procedure is important because it
helps to simplify the decision task even further. Instead of having decision makers state the value that
makes the two options equivalent, as in Tversky's elicitation, we simplified the decision task by
having people repeatedly choose their preferred option from a set of iterative binary choices. This
simplification also helps decision makers avoid the anchoring effect (Thorsteinson et al., 2008),
since they no longer need to come up with an indifference value.
The so-called "indifference approach" also eliminates the range-insensitivity bias. Recall that
to avoid the range-insensitivity bias, decision makers have to take into account the range of
each attribute when making trade-off judgments, a somewhat cognitively demanding task. Using our
procedure, however, decision makers do not have to know or think about the ranges of the two
compared attributes at all, as the ranges are never mentioned to them.
Consistency check
The use of exchange rates to learn about the value of information privacy also provides a
means to examine the consistency of the respondents' judgments. Consistency checking is standard
good practice in decision analysis. The process can be set up in different ways, but it is generally
accomplished by eliciting a redundant number of trade-off judgments. The following paragraphs
sketch the process of checking a decision maker's consistency.
An alternative with n attributes requires the elicitation of n-1 trade-off judgments against a
single reference attribute to derive the normalized exchange rate or "weight" for each of the n
attributes. The n-1 judgments create n-1 equations containing the n attribute weights. The
normalizing equation, in which the sum of the attribute weights is equal to 1, completes the system
of equations for the n attribute weights. From this system of equations, the attribute weights can be
calculated. When the number of elicited trade-off judgments is greater than n-1, the extra judgments
provide additional information to check the decision maker's internal consistency. In essence, the
additional information provides a different system of equations to solve for the n attribute weights.
The decision maker can then assess his consistency by checking the correlations between the
attribute weights derived from the two systems. Note that the consistency analysis is conducted on
an individual basis.
The following example concretizes the consistency check process. A decision maker is
considering purchasing a privacy-enhanced smartphone, and he is mostly concerned with three
attributes: privacy, cost, and usability. Supposing that the decision maker has established his exchange
rates between privacy and cost and between privacy and usability, these two judgments, together
with the constraint that the attribute weights sum to 1, are sufficient to calculate the weights for all
three attributes: privacy, cost, and usability. The obtained attribute weights are unique
to the trade-off judgments against the privacy attribute. Suppose the decision maker further
completes the trade-off between cost and usability. The availability of the trade-off judgments
between cost and privacy and between cost and usability establishes another system of equations
(using the normalizing equation again), which allows for the calculation of a different set of attribute
weights unique to the trade-off judgments against cost. The decision maker can examine the
correlations between the weights from the two systems to gain further insight into his judgments.
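As a rough illustration of this check (a sketch with hypothetical exchange rates, not the thesis's analysis code), the weights implied by two different systems of trade-off judgments can be computed and correlated as follows:

```python
import numpy as np

# A trade-off judgment is expressed here as an exchange rate relative to a
# reference attribute (the reference itself has ratio 1.0).  Numbers are hypothetical.
def weights_from_ratios(ratios):
    """Normalize importance ratios so the resulting weights sum to 1."""
    total = sum(ratios.values())
    return {name: r / total for name, r in ratios.items()}

# System 1: both judgments made against privacy
# (e.g. privacy judged 2x as important as cost, 4x as important as usability).
system1 = weights_from_ratios({"privacy": 1.0, "cost": 0.5, "usability": 0.25})

# System 2: judgments made against cost (cost-privacy and cost-usability).
system2 = weights_from_ratios({"privacy": 2.0, "cost": 1.0, "usability": 0.6})

order = ["privacy", "cost", "usability"]
w1 = np.array([system1[a] for a in order])
w2 = np.array([system2[a] for a in order])

# Consistency check: correlation between the two weight vectors for this individual.
consistency = np.corrcoef(w1, w2)[0, 1]
print("System 1 weights:", dict(zip(order, w1.round(3))))
print("System 2 weights:", dict(zip(order, w2.round(3))))
print("Consistency (r):", round(consistency, 3))
```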
Chapter 3. Method
Experimental procedure
The experiment began with a three-minute video describing the study and carefully
explaining how the attributes were defined. Each respondent was randomly assigned to one of five
conditions in which the type of privacy threat was manipulated: (1) governmental intrusion, (2)
marketing, (3) crime, (4) snooping, or (5) unspecified. Respondents completed all possible pairwise
assessments among the five attributes, with up to three binary choices per assessment (30 choices in
total). The order of the ten trade-off assessments was randomly determined. The experiment ended
with respondents completing several questions regarding their demographic information (sex, age,
and political attitude) and a standardized 12-item scale measuring privacy attitude and concern
(Karat, Karat, Brodie, & Feng, 2005). The standardized scale demonstrated good
reliability (Cronbach's alpha = .844).
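For reference, Cronbach's alpha for such a scale can be computed from a respondents-by-items matrix as sketched below (a generic formula applied to simulated responses, not the study's data; random responses will not reproduce the reported .844).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 232 respondents x 12 items on a 5-point scale.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(232, 12)).astype(float)
print(round(cronbach_alpha(responses), 3))
```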
The experiment was hosted on Qualtrics.com and 232 respondents were recruited from
Amazon Mechanical Turk (AMT). Previous studies have shown that AMT samples are
generally more representative than other convenience samples (Buhrmester, Kwang, & Gosling,
2011; Mason & Suri, 2012; Paolacci, Chandler, & Ipeirotis, 2010). Sex was almost evenly distributed
in our sample, with 49.57% female. Respondents reported an average age of 34. About 90%
reported that they were currently using a smartphone, and all respondents had used a smartphone at
some point in the past 10 years. 50.86% were classified as liberals, while 21.15% were
conservatives. The remaining 27.99% considered themselves as having a "neutral" political attitude.
Attribute definitions
The first step in constructing the trade-off judgments for privacy is to explore users' personal
values. We first brainstormed objectives that we believed most smartphone users care about.
We then conducted informal interviews with the staff at our research center to verify whether our
brainstormed objectives map onto smartphone users' concerns and to explore other possible
objectives. The results yielded four objectives: maximizing privacy, minimizing cost,
maximizing usability, and maximizing performance. Next, we defined attributes for these objectives
in a manner consistent with standard recommendations (Eisenfuhr, Weber, & Langer,
2010).
Table 2 summarizes the objectives as well as the corresponding measurable attributes.
Briefly, the objective of minimizing cost means minimizing the purchase cost as well as the
monthly payment that users have to pay. Maximizing ease of use (or usability) refers to the
capability of a device to minimize users' efforts while achieving desirable end results. We defined
this objective as the time it takes to learn all functions of a new phone. The objective of maximizing
performance is operationalized as two attributes: maximizing the number of mobile applications (or
apps) and minimizing the downloading/uploading time for a photo. Mobile apps are important
because they carry out a number of functions, and downloading and uploading photos (e.g. selfies)
has become a very popular activity for many smartphone owners.
In particular, we chose encryption as an operationalization for (high) privacy. Encryption
protects data privacy by transforming plaintext into random-looking sequences of characters
(ciphertext). Encryption can be used to encode data in transmission (files, texts, photos, videos, etc.),
and only an authorized sender and receiver with the appropriate key can read the data (symmetric
encryption). Encryption, indeed, is an important component of a privacy-focused ecosystem in which
not only the device but also the service is secured. This leads to the possibility that users have to
make a number of sacrifices for privacy when they decide to accept the encryption technology as
part of a larger privacy-enhancing smartphone ecosystem. For instance, Blackphone users do not
have access to some mobile applications that the Blackphone considers insecure, and they have to
pay more for the monthly security subscription service (www.Blackphone.com). In addition, despite
being built upon the popular Android platform, the Blackphone is equipped with innovative security
features that may make it harder to use for some users.
Elicitation procedure
Figure 1 illustrates our elicitation procedure graphically. Consider the trade-off between
privacy and cost, represented as a series of up to three binary-choice trials with two smartphone
options, A and Bi (i = 1, ..., 7). Smartphone A is not encrypted, but the cost associated with option A
is also low. On the other hand, Bi is more expensive, but data stored in smartphone Bi are encrypted.
A respondent is asked to choose either A or Bi. Depending on the choice in the first trial, the cost for
Bi is adjusted dynamically in the next trial while the cost for A is fixed; Bi+1 > Bi if the respondent
chooses Bi; conversely, Bi+1 < Bi if A is chosen. The procedure is repeated until the respondent is
indifferent between the two options or reaches the third trial.
The dependent variable, the value of privacy protection, is determined by taking the
difference in cost between the two options whenever the respondent indicates indifference. If the
respondent does not select the "indifference" option in any of the three trials, the trade-off value for
privacy protection is bounded using inequalities determined from the three trials. For instance, if a
respondent selects A in the first trial, B2 in the second trial, and B3 in the third trial, it could be
inferred that the respondent is willing to pay between B6 and B1 (B6 < B1) dollars for an encrypted
smartphone. We refer to the quantitative difference between the two options as the incremental
trade-off value for privacy. Elicitations of the privacy trade-off value in terms of the other attributes
(apps, usability, and processing time) followed the same procedure.
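A minimal sketch of this iterative elicitation logic is shown below (illustrative only: the candidate premium values and the bisection rule are assumptions, not the study's actual survey implementation).

```python
CANDIDATE_PREMIUMS = [25, 50, 75, 125, 200, 300, 400]  # hypothetical extra $/month for the encrypted option

def elicit_premium(choose):
    """choose(premium) -> 'A' (cheaper, unencrypted), 'B' (encrypted, costs
    `premium` more per month), or 'indifferent'.  Runs up to three trials,
    bisecting over the candidate premiums; returns an exact indifference value
    or (lower, upper) bounds on the privacy premium."""
    lo, hi = 0, len(CANDIDATE_PREMIUMS) - 1
    for _ in range(3):
        mid = (lo + hi) // 2
        answer = choose(CANDIDATE_PREMIUMS[mid])
        if answer == "indifferent":
            return CANDIDATE_PREMIUMS[mid]       # exact indifference point
        if answer == "B":                        # willing to pay this premium: probe higher
            lo = mid + 1
        else:                                    # prefers A: this premium is too high
            hi = mid - 1
    lower = CANDIDATE_PREMIUMS[lo - 1] if lo > 0 else 0
    upper = CANDIDATE_PREMIUMS[hi + 1] if hi + 1 < len(CANDIDATE_PREMIUMS) else float("inf")
    return (lower, upper)

# Example: a simulated respondent willing to pay up to $120/month extra for encryption.
print(elicit_premium(lambda p: "B" if p <= 120 else "A"))   # -> (75, 125)
```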
Privacy threat manipulation
To evaluate the effect of different types of privacy threat on trade-offs for a secured
smartphone, we manipulated the operationalization of the privacy attribute. Specifically, the privacy
definition comprises two parts. The first part defines high versus low privacy where high privacy
means a smartphone is encrypted and low privacy means a smartphone is not encrypted.
Respondents were told that encryption helps to protect their privacy by preventing unauthorized
parties from gaining access to their contact lists, apps, web search history, text messages, instant
messages, and emails. The second part is a specific example of how a secured smartphone can
protect users from privacy violations. In particular, different attacker identities and various possible
consequences were made salient in the example.
The attacker identity and the possible consequence of a privacy violation were vaguely
described in the unspecified threat condition. Indeed, respondents in this condition did not know who
the attacker was or what concrete consequences they might suffer from a privacy breach. On
the other hand, in each of the other four specified threat conditions, respondents learned who the
attacker was, what the attacker could do with the obtained personal information, and the potential harms
associated with the privacy breach. In particular, the following descriptions were used to manipulate
the type of privacy threat:
Unspecified: Privacy refers to the extent to which access is gained to your phone without
your consent.
Government: Privacy refers to the extent to which the government can gain access to
your phone without your consent. For example, law enforcement officers
can obtain access to your private text messages and phone calls if they
believe you pose a security threat, have expressed anti-government
sentiment, or are simply perceived as “suspicious”.
Marketing: Privacy refers to the extent to which a marketer can gain access to your
phone without your consent. For example, pharmaceutical companies can
track your web browsing behavior and then send you advertisements for
sensitive pharmaceutical products (such as birth control) and medications
(such as antidepressants) that you personally do not want or want others to
know you have been researching.
Crime: Privacy refers to the extent to which a criminal can gain access to your phone
without your consent. For example, criminals can steal and sell your credit
card numbers, resulting in you having to regularly report fraudulent charges to
your credit card company and deal with ongoing concerns about identity theft.
Snooping: Privacy refers to the extent to which a friend or family member can gain
access to your phone without your consent. For example, there are apps
available for download for a fee that allow the friend or family member to
snoop on voice and text communications on your phone, and to spy on face to
face interactions using the phone’s microphone and camera.
Choice Presentation
Trade-off judgments were assessed by having respondents choose one of two smartphone
options described along five attributes. Figure 2 is an example of how we presented trade-off
choices to respondents. In this example, where the trade-off is between privacy and cost, the
remaining attributes (speed, usability, and apps) were depicted identically for the two options.
Respondents could choose either option or could indicate indifference. If they were indifferent, the
assessment stopped. Otherwise, the quantity of cost in option 1 would be adjusted in the next trial.
Chapter 4. Results
To address the first question of how smartphone users make trade-offs for data privacy, we
plotted the cumulative distributions of the trade-off values for encryption against each of the other
four attributes and reported the median trade-off values for privacy. To examine whether
the privacy trade-off is sensitive to different types of privacy threat, we used non-parametric
methods to compare the privacy trade-off values across the experimental conditions. Binary logistic
regression (BLR) analyses were conducted to examine the effects of individual differences on
privacy trade-offs.
The Value of a Secured Smartphone
Table 3 shows the median trade-off values for an encrypted smartphone across the
experimental conditions. The results indicate that 50% of respondents were willing to pay an
incremental cost of between $37.50 and $125 per month for an encrypted smartphone. Half of the
respondents reported that they would choose an encrypted smartphone despite having to wait between 7
and 16 seconds longer to download/upload a photo. Similarly, to protect their privacy, half of the
sample was willing to forgo between 11 and 19 mobile apps out of the top 20 most popular
applications available for download. Lastly, half of the sample was willing to spend between 22.5
and 360 minutes of extra time to learn all features of an encrypted smartphone. Clearly, despite the
group differences, the median values indicate that respondents were willing to incur non-trivial
losses to protect their (smartphone) privacy. These results have an important implication because
they suggest that people are willing to pay a non-trivial "premium" to protect their privacy.
The use of the medians to represent the magnitude of the privacy trade-off values has the
disadvantage of disguising the individual differences among respondents. Indeed, the cumulative
plots of the four privacy trade-off assessments displayed in Figure 3 unambiguously demonstrate
the effect of individual differences on privacy trade-off values. The x-axes in all four plots show the
incremental trade-off value for privacy, while the y-axes show the cumulative percentage. Note that
the values on the x-axes are the seven "true" indifference points used in the elicitation procedure.
Different curves correspond to different experimental conditions. Each point on a curve reflects the
proportion of respondents who were willing to sacrifice a given amount of cost, time, apps, or
usability for (high) privacy.
The lowest point on a curve (closest to the origin) represents the proportion of
respondents who were willing to make a sacrifice for privacy only when the reduction in the other
attribute was less than the defined minimum threshold. Stated differently, these people showed little
interest in purchasing a secured smartphone. About 10% to 25% of respondents were only willing to
pay less than an additional 25 dollars per month to have their smartphone encrypted. Additionally,
between 4% and 12% of respondents found the value of encryption equivalent to the loss of fewer
than two of the most popular mobile applications. Put differently, these people indicated
willingness to purchase the encrypted smartphone only as long as they had to give up one or none of
the mobile apps.
Similarly, between 4% and 20% of respondents revealed a willingness to make a trade-off for
privacy only so long as the reduction in processing time was less than 1 second. There is, however,
great variation in the proportion of people who were willing to trade usability for privacy. Specifically,
about 4% of respondents in the crime, snooping, marketing, and unspecified conditions and 50% of
respondents in the government condition considered the value of an encrypted smartphone
equivalent to less than an additional 30 minutes of learning time.
On the other hand, the highest point on a curve (farthest from the origin) is the
proportion of respondents who were willing to make a sacrifice for privacy even when the reduction
in the other attribute was more than the defined maximum threshold. In other words, these respondents
perceived the value of privacy protection as too "sacred" to give up, at least within our specified
ranges. For example, about 6% to 9% of respondents were willing to pay more than an additional
$400 per month for an encrypted smartphone, and about 22% to 40% of respondents found the value
of an encrypted smartphone equivalent to an additional 16 seconds or longer of processing time. In
addition, between 35% and 65% of respondents were willing to give up more than 17 of the top 20
most popular apps to secure smartphone data. Similarly, 33% to 45% of respondents indicated that
they were willing to commit to spending more than 480 minutes to master all functionality of a new
secured smartphone.
The variations in the proportions of respondents who were willing to sacrifice a lot or a little
for privacy are consistent with the common finding in the literature on individual differences in
privacy concern. Recall that Westin (1991), using survey research, classified people with privacy
concerns into three distinct groups: privacy fundamentalists, privacy marginals, and privacy
pragmatists. Using a different methodology, we also found that concern for privacy, as measured by
the value of privacy protection, was not uniformly distributed.
Importantly, using the trade-off paradigm, we were able to show that the extent to which our
respondents valued their information privacy was also dependent on the attributes that they had to
sacrifice for privacy protection. For instance, only 9% of the respondents were willing to go beyond
the defined maximum monthly payment to protect their privacy, while as many as 65% of
respondents were willing to sacrifice more than the defined maximum number of apps to secure their
smartphone data.
The effect of different types of privacy
The cumulative plots in Figure 3 also graphically suggest a differential effect of the type of
privacy threat on privacy trade-off values. Indeed, the five curves, representing the five experimental
conditions, do not overlap. In particular, curves to the right (lower curves) dominate curves
to the left (higher curves), meaning that respondents in the dominating group were willing to make
greater sacrifices for privacy than respondents in the dominated group. Indeed, k-independent
sample Kruskal-Wallis tests detected significant differences in the assessments between privacy
and apps and between privacy and usability, ps < .05.
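For illustration, the omnibus and pairwise comparisons reported here can be run as follows (hypothetical data, not the study's; note that SciPy reports the Mann-Whitney U statistic, whereas the thesis reports the equivalent Wilcoxon W).

```python
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

# Hypothetical coded indifference-point levels for each threat condition.
rng = np.random.default_rng(1)
conditions = {
    "unspecified": rng.integers(1, 8, 45),
    "government":  rng.integers(1, 8, 45),
    "marketing":   rng.integers(1, 8, 47),
    "crime":       rng.integers(1, 8, 48),
    "snooping":    rng.integers(1, 8, 47),
}

# Omnibus test across all five conditions.
h, p = kruskal(*conditions.values())
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.3f}")

# Pairwise follow-up, e.g. unspecified vs. government.
u, p = mannwhitneyu(conditions["unspecified"], conditions["government"])
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")
```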
However, the KW tests did not specify which experimental groups differed.
Thus, we used Mann-Whitney-Wilcoxon tests to uncover the distinct trade-off patterns between the
experimental conditions. In the trade-off between privacy and usability, we found that the
distribution of privacy trade-off values in the unspecified condition significantly dominated the
distribution of trade-off values in the government condition, W = 457, p < .001. However, the
distributions of privacy trade-off values in the snooping, crime, and marketing conditions did not
differ significantly from the unspecified group. Furthermore, privacy trade-off values in the
government condition were significantly dominated by the trade-off values in the crime group, W =
405, p < .001, the snooping group, W = 51.5, p < .001, and the marketing group, W = 74, p < .001. The
results provide mixed support for our hypothesis that the value of privacy protection is higher
when the type of threat is unspecified compared to when it is specified. Consistent with our
expectation, respondents were more willing to sacrifice usability for privacy when they were
presented with an unspecified privacy threat as opposed to when they were presented with the
specified privacy threat of government intrusion. On the contrary, the value of privacy protection
was indistinguishable between the unspecified threat condition and the marketing, crime, and
snooping conditions.
In the trade-off assessment between privacy and apps, Mann-Whitney-Wilcoxon tests
revealed distinct trade-off patterns between the government and snooping conditions and the
unspecified condition. Specifically, the distributions of privacy trade-offs in the government and
snooping conditions were significantly different from (dominated by) the unspecified group, W = 341,
p = .013, and W = 315.5, p < .05, respectively. The distributions of the privacy values in the crime
and marketing conditions, however, were not significantly different from the unspecified group.
Notably, the distribution of the number of apps sacrificed for an encrypted phone in the
government condition was marginally dominated by the distribution of the number of apps
sacrificed to protect oneself from being snooped on, W = 190.5, p = .06.
Similar to the aforementioned results for the privacy-usability assessment, we obtained mixed
support for our hypothesis on the difference in the value of privacy between the unspecified and
specified threat conditions. While the significant differences in the value of privacy protection
between the unspecified condition and the government and snooping conditions provided support
for the hypothesis, the similarities in the distributions of trade-off values between the unspecified
condition and the specified marketing and crime conditions negated it. Interestingly, we also
discovered a difference in the trade-off value for privacy against the government versus against
snooping. In particular, respondents appeared to place a higher value on privacy protection against
snooping than against the government.
The results suggest two interesting findings. First, there is modest evidence
supporting the CLT prediction that an ambiguous, unspecified privacy threat can
increase the value of privacy protection. Second, the privacy threat posed by the government appears
to be relatively inconsequential, as respondents indicated little willingness to make trade-offs to
protect themselves from governmental intrusion. Stated differently, respondents did not value privacy
as much when their private information was collected by the government. This finding provides some
support for the context-specific hypothesis of privacy, in which the value of privacy protection is
contingent upon the specific type of privacy intrusion.
The effect of individual characteristics
We formed a set of four new dependent variables (DVs), each corresponding to a specific
privacy trade-off. Respondents coded 0 were those whose incremental trade-off values for privacy
were less than the medians (lower value of privacy protection), and respondents coded 1 were those
whose incremental trade-off values were greater than the medians (higher value of privacy
protection). Since the main interest is to understand the effect of individual characteristics on the
value of privacy, we used stepwise binary logistic regression (BLR) to control for the experimental
effect. Procedurally, in each model we first entered the experimental condition as a predictor of the
DV. Next, we added the following predictors: sex, age, political attitude (liberal, neutral,
conservative), and general privacy concern.
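A sketch of this two-step procedure is given below (illustrative Python using statsmodels, not the analysis code actually used; the DataFrame and column names, such as condition, sex, political_attitude, and privacy_concern, are hypothetical stand-ins for the study variables).

```python
# A minimal sketch of the hierarchical (stepwise) binary logistic regression:
# step 1 enters the experimental condition, step 2 adds individual characteristics.
# All column names are illustrative placeholders, not the study's actual variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_hierarchical_blr(df: pd.DataFrame, tradeoff_col: str):
    data = df.copy()
    # Median split: 1 = above-median trade-off (higher value of privacy protection).
    data["high_value"] = (data[tradeoff_col] > data[tradeoff_col].median()).astype(int)

    # Step 1: experimental condition only (unspecified threat as the reference group).
    step1 = smf.logit(
        "high_value ~ C(condition, Treatment('unspecified'))", data=data
    ).fit(disp=False)

    # Step 2: add sex, age, political attitude, and general privacy concern.
    step2 = smf.logit(
        "high_value ~ C(condition, Treatment('unspecified')) + C(sex) + age "
        "+ C(political_attitude, Treatment('liberal')) + privacy_concern",
        data=data,
    ).fit(disp=False)
    return step1, step2

# Odds ratios, as reported in Table 4, are the exponentiated coefficients, e.g.:
# step1, step2 = fit_hierarchical_blr(df, "cost_tradeoff"); np.exp(step2.params)
```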
The detailed results are displayed in Table 4. The BLR analyses revealed some interesting
individual differences. The odds of conservatives paying more than the median cost for an encrypted
smartphone were significantly higher than the odds for liberals, by a factor of 2.424, and a one-unit
increase in age (expressed in years) was significantly associated with an increase in the odds of
willingness to tolerate longer than the median processing time, with an odds ratio of 1.04. Perhaps
the most surprising finding is that we did not find an effect of sex. This contrasts with a
previous study showing that females are less likely to disclose their personal information than
their male counterparts (Garbarino & Strahilevitz, 2004).
One possible interpretation is that the previous study did not control for general
online privacy concern. Since females tend to have higher concern for information privacy
(Pedersen, 1987; Bartel & Sheehan, 1999), the effect of sex on privacy protection
behaviors may be mediated by this third variable. Our data are consistent with this account. We
re-ran a series of stepwise BLRs on the same dichotomized dependent variables, entering the sex
variable in step 1 and the general privacy concern variable in step 2. The results showed that in the
trade-off between privacy and speed, while the sex variable predicted group membership in step 1,
this effect disappeared once privacy concern was included, providing support for our speculation.
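The follow-up check can be sketched as below (illustrative Python with hypothetical column names, following the conventions of the earlier sketch rather than reproducing the study's actual script): enter sex in step 1, add general privacy concern in step 2, and inspect whether the sex coefficient loses significance.

```python
# A minimal sketch of the follow-up check: does the sex effect disappear once
# general privacy concern is entered? Column names are illustrative placeholders;
# `high_value` is the dichotomized trade-off variable described above.
import statsmodels.formula.api as smf

def check_sex_effect(df):
    step1 = smf.logit("high_value ~ C(condition) + C(sex)", data=df).fit(disp=False)
    step2 = smf.logit(
        "high_value ~ C(condition) + C(sex) + privacy_concern", data=df
    ).fit(disp=False)
    for label, model in (("step 1 (sex only)", step1), ("step 2 (+ concern)", step2)):
        sex_pvalues = {term: round(p, 3) for term, p in model.pvalues.items() if "sex" in term}
        print(label, sex_pvalues)  # sex term significant in step 1 but not in step 2
```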
Interestingly, we found that respondents’ stated general privacy concern was predictive of
their trade-off behaviors. A one-unit increase in the score on the general privacy concern scale was
significantly associated with an increase in the odds of sacrificing for privacy. The relationship
between privacy trade-offs and general privacy concern was very consistent; significant effects
were detected across all four models. The effect of general privacy concern on the value
of privacy protection is somewhat surprising, as the information privacy literature suggests that
stated privacy concern is not a reliable predictor of privacy-related behaviors. For instance, stated
privacy concern did not predict actual privacy-protective behaviors in earlier work (Spiekermann,
Grossklags, & Berendt, 2001). However, our result suggests that general concern for privacy, when
measured with a reliable scale, can still be a good predictor of what people actually do with their
private information.
Consistency check results
The process of completing the consistency check was described in the previous section
(Modeling Approach). To calculate the attribute weights for privacy, money, wait time, apps, and
user interface, we only needed to elicit four trade-offs against privacy: privacy-money,
privacy-wait time, privacy-apps, and privacy-user interface. As described earlier, this allows for the
derivation of the attribute weights unique to the privacy trade-offs (referred to hereafter as attribute
weights from the privacy set). However, we elicited ten different judgments, which enabled us to
derive four additional sets of attribute weights unique to the judgments against money, wait time,
apps, and user interface (referred to hereafter as attribute weights from the money set, wait time set,
app set, and usability set, respectively). We then compared the attribute weights across the five
attributes and between the different sets by examining their correlations.
The following steps were completed to transform the crude exchange rates, derived directly
from the respondents’ judgments, into normalized weights. First, the utility of achieving a specific
level of an attribute was derived as follows, assuming a linear utility function:

$u(x_i) = \dfrac{x_i - x_i^{\text{worst}}}{x_i^{\text{best}} - x_i^{\text{worst}}}$

The worst and best trade-off values reflect the range of the attribute, while the trade-off value $x_i$ is the
specific level of the attribute in the trial at which a respondent was indifferent. Furthermore, an
indifference trial indicates that the respondent found the two alternatives, which varied on two
attributes at a time, equally attractive. Thus, for two options that differ only on attributes $i$ and $j$,
we have:

$w_i u_i + w_j u_j = w_i u_i' + w_j u_j'$

where $w_i$ is the weight for attribute $i$, and $u_i$ and $u_i'$ are the utilities of attribute $i$ under the two
smartphone options. From a set of judgments against a unique attribute, e.g., privacy, we have the
normalization constraint

$w_{\text{privacy}} + w_{\text{money}} + w_{\text{time}} + w_{\text{apps}} + w_{\text{usability}} = 1$

together with the four indifference equations

$w_{\text{privacy}} u_{\text{privacy}} + w_{x} u_{x} = w_{\text{privacy}} u_{\text{privacy}}' + w_{x} u_{x}', \quad x \in \{\text{money, time, apps, usability}\}.$
Solving this system of equations yields the weights for the five attributes with respect to the
privacy judgments, as in this example. Repeating this process with respect to the other attribute
judgments results in four additional sets of weights. Once all the attribute weights with respect to
the different sets were obtained, we examined their correlations with one another. From a
psychometric standpoint, we considered correlations above 0.7 to be evidence of consistency.
Table 5 contains the medians of the correlations between any two of the five sets of weights.
As expected, some median correlations were higher than others: the median correlations between
attribute weights from the privacy set and the other sets (below 0.8) were smaller than the median
correlations among the non-privacy sets (above 0.8). In general, however, the minimum median
correlation exceeded 0.7, indicating that judgment consistency was good.
Chapter 5. Discussion
The current study assessed how people value their information privacy in the
context of smartphone technology. We showed that our respondents were willing to forgo a number
of tangible benefits to protect their smartphone privacy. Interestingly, the value of smartphone
privacy was susceptible to a number of factors, including the type of privacy threat and individual
characteristics. These findings not only enhance our understanding of the extent to which, and when,
people value their information privacy, but they also have practical implications for the design of a
usable cyber security ecosystem.
Public polls consistently indicate that Americans are concerned about their information
privacy (Culnan & Milne, 2001; TRUSTe, 2012). The next logical step, therefore, is to evaluate how
this concern for information privacy influences people’s privacy-related judgments and behaviors,
and what we can do to increase the safety and security of personal information stored in cyberspace.
On the one hand, several studies indicate that, despite their stated privacy concerns, people were
willing to trade their personal data for financial gains (Acquisti & Grossklags, 2005a), for
personalization benefits (Chellappa & Sin, 2005), and for convenience and discounts (Spiekermann
et al., 2001). These findings paint a pessimistic outlook for relying on users’ behaviors to protect
their own privacy. On the other hand, empirical research also finds that people are willing to pay for
privacy protection (Hann et al., 2007). In a recent experiment, Tsai and her colleagues (2011) found
that some consumers were willing to pay a premium to purchase products from privacy-protective
websites when privacy information was made salient, compared to when the information was
unnoticeable. This (and other) research suggests a possible path to privacy protection: privacy can be
protected through the use of commercial privacy-enhancing products or technologies.
Our study examined this possibility more thoroughly by quantifying the “premium”
that people may be willing to pay to protect their privacy in the context of smartphone technology.
We showed that half of our respondents were willing to pay a non-trivial monthly premium of $62.50
or more for privacy protection. More importantly, we also modeled how people valued their privacy
in relation to other desirable concerns. The data revealed that, in addition to their willingness to pay
more for privacy protection, the respondents were also willing to sacrifice their preferences for high
usability and performance to increase the protection of their private information.
From an information systems perspective, our study contributes to a greater understanding of
how to develop more usable security procedures. Most information systems require users to be
mindful of potential privacy risks and to exercise some form of protective measure against
cyber threats. However, behavioral research consistently demonstrates that people often give up
their privacy for other desirable concerns, e.g., to minimize inconvenience (Glassman,
Vandenwauver, & Tam, 2010). This consistent finding highlights the importance of integrating
users’ personal values into the design of cyber security systems. Our research findings help to
address this issue by identifying what people care about and how they trade off these concerns
against privacy. As a result, these empirical findings can be pragmatically valuable for the
development of more usable privacy-enhancing security systems.
Interestingly, how much people value their smartphone privacy not only depends on the type
of trade-off, as discussed above, but is also susceptible to the type of privacy threat. Using CLT, we
predicted that the value of privacy protection should be higher when the type of privacy threat is
unspecified than when it is specified. We found partial support for this prediction.
Respondents in the unspecified condition held a higher value of privacy protection than respondents
in the government condition when the value of privacy was quantified in terms of the loss of
usability and the loss of mobile applications. The same pattern was found when comparing the value
of privacy protection between the unspecified and snooping conditions in the privacy-apps trade-off.
CLT specifies the mechanism through which different perceptions of psychological distance
can influence judgments and behaviors. However, we did not measure perceived psychological
distance or use it as a covariate in our analyses. Therefore, we were unable to test the specific
mediating relationship postulated in CLT. This is one limitation of the current study that future
research needs to address. On the other hand, although we cannot demonstrate that the relationship
between different types of privacy threats and the value of privacy is due to distinct perceptions of
psychological distance, the data suggest that at least some of the competing hypotheses for this
relationship are not warranted.
As an example, one possible account is that our respondents might have interpreted the use
of encryption against an unspecified privacy threat as if encryption safeguarded them from many
different types of privacy risks, including privacy violations by the government, marketers, family,
and friends. Under this account, respondents behaved perfectly rationally, as they invested more in
the type of resource that protects them from the largest pool of risks. This account may explain why
respondents valued their privacy more when the type of threat was unspecified than when it was
specified. However, it fails to explain why the trade-off values for privacy were indistinguishable
between the unspecified condition and the specified marketing and crime conditions.
The other interesting finding is the indifferent attitude toward privacy violations by the
government. The data suggest that our respondents were willing to sacrifice very little to protect
themselves from having the government exploit their personal information. There is a large literature
in behavioral science suggesting that people find acts that violate civil liberties and civil rights
more acceptable when they trust the government and when such acts are necessary to protect public
safety, especially after national disasters such as terrorism events (see McArdle, Rosoff, & John,
2012, for an example). Consequently, it is tempting to infer that our respondents were willing, if not
eager, to have the government violate their privacy because they trust the governing authority.
Nonetheless, our study was conducted a few months after the NSA scandal, when public trust in the
government was very low (Newport, 2013). Thus, it does not seem reasonable to attribute the distinct
pattern of trade-off values in the government condition to trust.
There are at least two more plausible competing accounts. The first is that our respondents
did not believe that encryption, or any type of privacy-enhancing mechanism, could protect them
from the government. Our respondents were probably aware that the government can compel
businesses to hand over the “keys” needed to break their encryption. Thus, our respondents’
behaviors are rational: they should not pay for a technology that offers no real protection of their
privacy. The other competing account is that our respondents were simply not concerned about the
government’s intrusive behaviors. Perhaps they believed that they need not fear having the
government violate their privacy because they have done nothing wrong, at least in the sense of
damaging national security. Certainly, the current experiment was not designed to adjudicate between
these competing hypotheses. Additional research on this topic is therefore needed to resolve the
puzzle of the indifferent attitude toward governmental intrusion.
Bansal, Zahedi, and Gefen (2008) reasoned that the term “context” in privacy research could
be related to the type or domain of the research construct (discipline), time (when), location (where),
occupation (who), culture (with whom), and rationale (why). Our current study contributes to the
context-based privacy literature by suggesting the type of privacy threat as a new contextual variable
that can influence how people value their information privacy.
From a methodological point of view, the current research contributes a method that can be
easily adapted to elicit preferences for information privacy. The most common method for studying
information privacy is public polling. The use of public polls is attractive because it offers relatively
quick insight into how the general public thinks about important topics such as information privacy,
and it makes it easy to collect data from a large, representative sample. However, several scholars
have criticized the use of public poll findings to inform privacy policy (Gandy, 2003; Bishop et al.,
1986; Price & Neijens, 1998). In particular, there is evidence that reliance on conventional public
polls may contribute to the puzzling phenomenon known as the “privacy paradox” (see Baek, 2014,
for a methodological account).
The method used in this study can serve as an alternative to public polling. Similar to the
polling approach, the trade-off method, as demonstrated in this study, is relatively easy to administer
to a large sample. More importantly, the trade-off method can provide more meaningful information
about people’s preferences for privacy. While the polling approach can reveal people’s general
preference and concern for privacy, the trade-off method can quantify this preference by establishing
an exchange relationship between privacy and other desirable attributes.
The disadvantage of the “indifference procedure” is its laborious setup. In theory, one can
find a decision maker’s “true” indifference point by continuing to present binary choices until the
decision maker feels indifferent. However, in the context of an online experiment, this requirement
proves difficult, as the researchers need to pre-program a relatively large number of choice trials;
each additional iteration doubles the number of branches, so a procedure with d choices requires
2^d - 1 possible trials. For example, had the procedure stopped at the fourth trial (instead of the
third), we would have needed 15 possible trials per assessment (instead of seven). Thus, to achieve
efficiency and reduce cognitive load for our respondents, we stopped the iteration at the third trial.
However, had we continued the procedure beyond the third trial, we could have obtained more
refined trade-off values (narrower ranges). Hence, the effect of a longer elicitation procedure on
trade-off values for privacy is unknown, and future studies are needed to address this question. The
bisection logic is sketched below.
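The following is a minimal sketch of a bisection-style indifference search (our illustration of the elicitation logic, not the exact trial values or script used in the study; the example respondent and dollar values are hypothetical).

```python
# A bisection-style sketch of the indifference procedure: each binary choice
# halves the interval containing the indifference point, so pre-programming all
# branches of a d-step procedure requires 2**d - 1 possible trials.
def elicit_indifference(lo, hi, prefers_privacy_option, depth=3):
    """Return the interval bracketing the indifference value after `depth` choices.

    prefers_privacy_option(x) -> True if, at trade-off value x, the respondent
    still chooses the privacy-protective (encrypted) option.
    """
    for _ in range(depth):
        mid = (lo + hi) / 2.0
        if prefers_privacy_option(mid):
            lo = mid  # still willing to sacrifice `mid`; probe larger sacrifices
        else:
            hi = mid  # not willing at `mid`; the indifference point lies below
    return lo, hi

# Hypothetical respondent who is indifferent at a $90 monthly premium.
print(elicit_indifference(0, 400, lambda x: x < 90, depth=3))  # (50.0, 100.0)
print("trials to pre-program for depth d:", [2**d - 1 for d in (3, 4)])  # [7, 15]
```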
Research in information privacy is inherently tied to the development of new
technologies. Our research follows this tradition by examining people’s perceptions of privacy
protection in the context of emerging ubiquitous computing technology, of which the smartphone is
an example. Further research on this topic will undoubtedly improve our insight into the
interconnection between technological development and information privacy.
References
Amit, E., & Greene, J. D. (2012). You see, the ends don’t justify the means: visual imagery and
moral judgment. Psychological Science, 23, 861-868. doi: 10.1177/0956797611434965
Acquisti, A., & Grossklags, J. (2005). Privacy and rationality in individual decision making.
IEEE Security & Privacy Magazine, 3(1), 26-33. doi:10.1109/MSP.2005.2
Acquisti, A. (2004). Privacy in electronic commerce and the economics of immediate
gratification. In Proceedings of the 5th ACM conference on Electronic commerce (EC
'04). New York, USA: ACM (pp. 21-29). doi: 10.1145/988772.988777
Baron, J., & Spranca, M. (1997). Protected values. Organizational Behavior and Human Decision
Processes, 70(1), 1-16. doi:10.1006/obhd.1997.2690
Bartel, K. (1999). An investigation of gender differences in on-line privacy concerns and resultant
behaviors. Journal of Interactive Marketing, 13(4), 24-38.
Bansal, G., Zahedi, F., & Gefen, D. (2008). The Moderating Influence of Privacy Concern on the
Efficacy of Privacy Assurance Mechanisms for Building Trust: A Multiple-Context
Investigation. In Proceedings of the 29th International Conference on Information Systems,
Paris, France.
Baek, M. (2014). Solving the privacy paradox: A counter-argument experimental approach.
Computers in Human Behavior, 38, 33-42. doi:10.1016/j.chb.2014.05.006
Bishop, G. F., Tuchfarber, A. J., & Oldendick, R. W. (1986). Opinions on fictitious issues: The
pressure to answer survey questions. Public Opinion Quarterly. 50(2), 240–250.
doi:10.1086/268978
Burkert, H. (1997). Privacy-Enhancing Technologies: Typology, Critique, Vision. In P. E. Agre &
M. Rotenberg (Eds.), Technology and Privacy: The New Landscape (pp. 126-143). Boston,
USA: MIT Press.
Buhrmester, M., Kwang, T., & Gosling, S. D. (2011). Amazon’s mechanical turk: A new source of
inexpensive, yet high-quality data? Perspectives on Psychological Science, 6(1), 3-5.
doi:10.1177/1745691610393980
Chellappa, R. K., & Sin, R. G. (2005). Personalization versus privacy: An empirical
examination of the online Consumer’s dilemma. Information Technology and
Management, 6(2), 181-202. doi:10.1007/s10799-005-5879-y
Chin, E., Felt, A. P., Sekar, V., & Wagner, D. (2012). Measuring user confidence in
smartphone security and privacy. Proceedings of the Eighth Symposium on Usable Privacy
and Security
Culnan, M.J. & Milne, G.R. (2001). The Culnan-Milne survey on consumers & online privacy
notices: Summary of responses. In Interagency Public Workshop Get Noticed: Effective
Financial Privacy Notices. Washington, D.C.
http://www.ftc.gov/bcp/workshops/glb/supporting/culnan-milne.pdf.
Culnan, M. J., & Bies, R. J. (2003). Consumer privacy: Balancing economic and justice
considerations. The Journal of Social Issues, 59, 323–342.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information
technology. MIS Quarterly, 13, 319-340.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A
comparison of two theoretical models. Management Science, 35, 982-1003.
doi:10.1287/mnsc.35.8.982
Eyal, T., Liberman, N., & Trope, Y. (2008). Judging near and distant virtue and vice. Journal of
Experimental Social Psychology, 44, 1204-1209.
Eyal, T., Sagristano, M. D., Trope, Y., Liberman, N., & Chaiken, S. (2009). When values matter:
Expressing values in behavioural intentions for the near vs distant future. Journal of
Experimental Social Psychology, 45, 35-43.
Egelman, S., Felt, A. P., & Wagner, D. (2013). Choice architecture and smartphone privacy: There's
a price for that. In R. Böhme (Ed.), The economics of information security and privacy (pp.
211-236). Berlin, Germany: Springer.
Eisenführ, F., Weber, M., & Langer, T. (2010). Rational decision making (pp. 74-76). Berlin,
Germany: Springer.
Garbarino, E., & Strahilevitz, M. (2004). Gender differences in the perceived risk of buying online
and the effects of receiving a site recommendation. Journal of Business Research, 57(7). 768-
775. doi:10.1016/S0148-2963(02)00363-6
Glassman, M., Vandenwauver, M., & Tam, L. (2010). The psychology of password management: A
tradeoff between security and convenience. Behaviour & Information Technology, 29(3), 233-
244. doi:10.1080/01449290903121386
Gandy, O. H. (2003). Public opinion surveys and the formation of privacy policy. Journal of Social
Issues. 59(2), 283–299.
Gross, R. & Acquisti, A. (2005). Information revelation and privacy in online social networks. In
Proceedings of the 2005 ACM workshop on Privacy in the electronic society (WPES '05), New
York, USA: ACM (pp. 71-80). doi: 10.1145/1102199.1102214
Grossklags, J. & Acquisti, A. (2007). When 25 cents is too much: An experiment on willingness-to-
sell and willingness-to-protect personal information. Unpublished manuscript, School of
Information, University of California at Berkeley, CA, USA.
Goldfarb, A., & Tucker, C. E. (2011). Privacy regulation and online advertising. Management
Science, 57(1), 57-71. doi:10.1287/mnsc.1100.1246
Hann, I., Hui, K., Lee, S. T., & Png, I. P. L. (2007). Overcoming online information privacy
concerns: An information-processing theory approach. Journal of Management Information
Systems, 24(2), 13-42. doi:10.2753/MIS0742-1222240202
Hern, A. (2014, September, 10). Information commissioner: “Apps are failing to respect user
privacy”. The Guardian. Retrieved from
http://www.theguardian.com/technology/2014/sep/10/information-commissioner-apps-failing-
user-privacy-information-facebook
Huberman, B. A., Adar, E., & Fine, L. R. (2005). Valuating privacy. IEEE Security & Privacy
Magazine, 3(5), 22-25. doi:10.1109/MSP.2005.137
Karat, J., Karat, C., Brodie, C., & Feng, J. (2005). Privacy in information technology: Designing to
enable privacy policy management in organizations. International Journal of Human-Computer
Studies, 63(1-2), 153-174. doi:10.1016/j.ijhcs.2005.04.011
Keeney, R. L., & Raiffa, H. (1993). Decisions with multiple objectives: Preferences and value
tradeoffs. New York, USA: Cambridge University Press.
Liberman, N., Trope, Y., & Wakslak, C. (2007). Construal level theory and consumer behavior.
Journal of Consumer Psychology, 17(2), 113-117. doi:10.1016/S1057-7408(07)70017-7
Lwin, M., Wirtz, J., & Williams, J. D. (2007). Consumer online privacy concerns and
responses: A power–responsibility equilibrium perspective. Journal of the Academy of
Marketing Science, 35(4), 572-585.
Mason, W., & Suri, S. (2012). Conducting behavioral research on amazon's mechanical turk.
Behavior Research Methods, 44(1), 1-23. doi:10.3758/s13428-011-0124-6
Hanemann, M. (1991). Willingness to pay and willingness to accept: How much can they differ?
American Economic Review, 81, 635-687
McArdle, S. C., Rosoff, H., & John, R. S. (2012). The dynamics of evolving beliefs, concerns,
emotions, and behavioral avoidance following 9/11: A longitudinal analysis of representative
archival samples. Risk Analysis, 32(4), 744-761. doi:10.1111/j.1539-6924.2012.01814.x
Moser, S., Bruppacher, S. E., & Mosler, H. (2011). How people perceive and will cope with risks
from the diffusion of ubiquitous information and communication technologies. Risk Analysis,
31(5), 832-846. doi:10.1111/j.1539-6924.2010.01544.x
Milne, G. R., & Gordon, M. E. (1993). Direct mail privacy–efficiency trade-offs within an implied
social contract framework. Journal of Public Policy and Marketing, 12(2), 206–215.
Newport, F. (2013). Americans disapprove of government surveillance programs. Retrieved from
http://www.gallup.com/poll/163043/americans-disapprove-government- surveillance-
programs.aspx
Paolacci, G., Chandler, J., & Ipeirotis, P. G. (2010). Running experiments on Amazon
Mechanical Turk. Judgment and Decision Making, 5(5), 411-419.
Pedersen, D. M. (1987). Sex differences in privacy preferences. Perceptual and Motor Skills, 64(3),
1239-1242. doi:10.2466/pms.1987.64.3c.1239
Price, V., & Neijens, P. (1998). Deliberative polls: Towards improved measures of
‘‘informed’’ public opinion? International Journal of Public Opinion Research. 10(2),
145–176.
Regan, P. M. 1995. Legislating Privacy: Technology, Social Values, and Public Policy, Chapel
Hill, USA: University of North Carolina Press.
Sheehan, K. B., & Hoy, M. G. (2000). Dimensions of Privacy Concern among Online Consumers.
Journal of Public Policy and Marketing, 19(1), 62-73.
Spiekermann, S., Grossklags, J., & Berendt, B. (2001). E-privacy in 2nd generation e-commerce:
Privacy preferences versus actual behavior. In Proceedings of the 3rd ACM conference on
Electronic Commerce (EC '01). New York, USA: ACM (pp. 38-47).
doi:10.1145/501158.501163
Shostack, A. (2003, May). Paying for privacy: Consumers and infrastructures. Paper presented at
the 2nd Annual Workshop on Security and Economic Security, Maryland, USA.
Smith, J., Dinev, T., Xu, H.(2011). Information privacy research: An interdisciplinary review. MIS
Quarterly. 35(4), 989-1015.
Tversky, A., & Kahneman, D. (1991). Loss aversion in riskless choice: A reference dependent
model. Quarterly Journal of Economics, (107), 1039-1061.
Thorsteinson, T. J., Breier, J., Atwell, A., Hamilton, C., & Privette, M. (2008). Anchoring effects on
performance judgments. Organizational Behavior and Human Decision Processes, 107(1), 29-
40. doi:10.1016/j.obhdp.2008.01.003
Tetlock, P. (2003). Thinking the Unthinkable: Sacred Values and Taboo Cognitions. Trends in
Cognitive Sciences, 7, 320–24.
Tedeschi, B. (2002, June 3). E-Commerce Report; Everybody talks about online privacy, but few do
anything about it. The New York Times. Retrieved from
http://www.nytimes.com/2002/06/03/business/e-commerce-report-everybody-talks-about-online-
privacy-but-few-anything-about-it.html
Tsai, J., Egelman, S., Cranor, L., & Acquisti, A. (2011). The effect of online privacy information on
purchasing behavior: An experimental study. Information Systems Research, 22(2), 254-268.
doi:10.1287/isre.1090.0260
TRUSTe (2012). Consumer privacy index-Q1. http://www.truste.com/consumerprivacy-index Q1-
2012/.
Tversky, A., Sattath, S., & Slovic, P. (1988). Contingent weighting in judgment and choice.
Psychological Review, 95, 371-384.
Utz, S., & Kramer, N. (2009). The privacy paradox on social network sites revisited: The role of
individual characteristics and group norms. Cyberpsychology: Journal of Psychosocial Research
on Cyberspace,3(2)http://cyberpsychology.eu/view.php?cisloclanku=
2009111001&article=2
von Nitzsch, R., & Weber, M. (1993). The effect of attribute ranges on weights in multiattribute
utility measurements. Management Science, 39(8), 937-943. doi:10.1287/mnsc.39.8.937
Westin, A.F. (1991). Harris-Equifax Consumer Privacy Survey 1991. Atlanta: Equifax, Inc.
Workman, M., Bommer, W. H., & Straub, D. (2008). Security lapses and the omission of
information security measures: A threat control model and empirical test. Computers in Human
Behavior, 24(6), 2799-2816. doi:10.1016/j.chb.2008.04.005
Table 1. Different types of privacy threats

Threat | Likelihood | Consequences | Control | Intention
Unspecified | Varied | Varied | Varied | Unknown
Government surveillance | Unlikely | Possible disclosure of sensitive information | Little | National security
Marketing | Very likely | Annoyance with ads | Little | Targeted marketing
Acquaintance snooping | Maybe | Psychological harms | Somewhat | Interpersonal reasons
Criminal hacking | Likely | Monetary loss | Somewhat | Financial benefits
Table 2. Objectives and attributes presented to respondents

Objective: Maximizing Privacy. Attribute: Encryption. Definition: “Low privacy” means the contents
of your phone and your communications are NOT encrypted. The information affected includes, but
is not limited to, contact lists, apps, and web search history, as well as your text messages, instant
messages, and emails. “High privacy” means that the contents of your phone and your
communications are encrypted.

Objective: Maximizing Usability. Attributes: Learning Time, Processing Time. Definition: Ease of
interface refers to the amount of time in minutes spent by a smartphone user to set up and fully
understand and control all of the functions of their new phone. Speed refers to the number of
seconds, on average, a smartphone takes to upload or download a standard photo on the Web using a
standard Internet connection. Uploading or downloading other pieces of information is assumed to
be proportional in the time required.

Objective: Maximizing Performance. Attribute: Top 20 Mobile Applications. Definition:
Applications or apps refer to the top 20 most popular apps available for download on your
smartphone. This includes apps currently on your phone and apps you may want to download in the
future. You can expect that for certain phones not all the same apps will be available.

Objective: Minimizing Cost. Attribute: Monthly Payment. Definition: Monthly cost refers to the
dollar amount paid each month for phone service and the smartphone. We assume that instead of
paying upfront for the phone, payment is set up as monthly payments over two years added to your
monthly phone bill.
Table 3. Median trade-off values for an encrypted smartphone

Condition | Cost (dollars) | Speed (min) | Apps lost | Usability loss (min)
Unspecified | 125 | 16 | 16 | 360
Government | 37.5 | 7 | 11 | 22.5
Marketing | 62.5 | 10 | 16 | 360
Crime | 62.5 | 12 | 19 | 360
Snooping | 62.5 | 12 | 14 | 360
Table 4. Binary logistic regression results from regressing the dependent variables on the
experimental groups, political attitude, age, sex, and online privacy concern. Each cell shows B (SE)
followed by the odds ratio (OR).

Predictor | Encryption v Cost | Encryption v Apps | Encryption v Speed | Encryption v Interface
Planned contrasts
Government vs Control | -.963 (.452), .382* | -.935 (.464), .392* | -.579 (.448), .561 | -3.440 (.792), .032**
Stalking vs Control | -1.505 (.469), .222* | -.425 (.438), .654 | -.572 (.447), .564 | -.153 (.433), .858
Crime vs Control | -1.406 (.466), .245* | .272 (.434), 1.312 | -.196 (.446), .822 | .098 (.441), 1.103
Market vs Control | -1.148 (.443), .317* | -.425 (.424), .654 | -.282 (.429), .754 | .018 (.422), 1.018
Political attitude
Conservative vs Liberal | .885 (.344), 2.424* | .208 (.339), 1.231 | .100 (.344), 1.105 | .007 (.363), 1.007
Moderate vs Liberal/Conservative | .228 (.689), 1.254 | .489 (.694), 1.631 | 1.019 (.698), 2.771 | -.292 (.733), .746
Sex | .154 (.292), 1.167 | -.036 (.292), .964 | .412 (.290), 1.510 | .011 (.307), 1.011
Age | -.011 (.013), .989 | .022 (.013), 1.023 | .040 (.014), 1.040* | .012 (.014), 1.012
Online privacy concern | .723 (.233), 2.061** | .565 (.232), 1.759* | .558 (.231), 1.747** | .733 (.253), 2.082*
Model significance | χ2(9) = 32.570, p < .001 | χ2(9) = 23.022, p = .006 | χ2(9) = 29.495, p = .001 | χ2(9) = 61.775, p < .001

Notes: Reference groups: Control, Liberal, Liberal/Conservative, Male. * p < .05; ** p < .001
Table 5. Medians of correlations between the attribute set judgments

Set | Privacy | Cost | Apps | Speed
Cost | 0.74 | | |
Apps | 0.80 | 0.80 | |
Speed | 0.73 | 0.82 | 0.82 |
Usability | 0.77 | 0.82 | 0.87 | 0.82
Figure 1: Hypothetical trade-off between privacy and cost
Figure 2: A screenshot of our choice presentation
Option 1 Features Option 2
High (encryption) Privacy Protection Low (no encryption)
$175 Monthly Cost $75
Good Number of Apps Good
Good Speed Good
Easy to use Ease of Interface Easy to use
Figure 3: Cumulative distributions of the trade-off values for an encrypted smartphone. The following color codes represent the
experimental conditions: Green = Government, Purple = Marketing, Turquoise = Snooping, Red = Crime, and
Blue = Unspecified threat.
[Figure 3 comprises four panels of cumulative distributions, each with the cumulative proportion of respondents (0 to 1) on the y-axis: Encryption vs Cost (incremental value of encryption in dollars, 25 to 400); Encryption vs Usability (incremental value of encryption in terms of loss in usability, 30 to 480 minutes); Encryption vs Apps (incremental value of encryption in terms of the number of apps lost, 2 to 17); Encryption vs Speed (incremental value of encryption in terms of loss of processing time, 1 to 16 minutes).]
Abstract
Increasing interest in technology to enhance online information privacy makes research on understanding how people value their privacy a priority. The current research aims to quantify the value of privacy protection in the context of smartphone technology by estimating the trade‐offs that people are willing to make for privacy protection. We also investigated how the value of privacy protection varied by contextual variables and individual characteristics. To do so, we adapted a paradigm proposed by Tversky, Sattah, and Slovic (1988) to examine the value of privacy protection by estimating trade‐offs for an encrypted smartphone. The results show that respondents were willing to pay non‐trivial premiums for smartphone privacy protection. Interestingly, ambiguously describing a privacy threat increased the value of privacy protection compared to depicting the threat with more contextual detail. We also observed the effects of general privacy concern, age, and self‐reported political attitude on the value of privacy protection. These results demonstrate the importance of accounting for user values in the design of a usable security system.