Privacy Economics

• MARKET - Economics - PRIVACY - Introduction and Basics

     

    Introduction to the Economics of Privacy

The economics of privacy applies the tools of economics to the analysis of privacy problems. These problems can relate either to a generic form of privacy or to personal privacy. Privacy is generic if it concerns an imbalance in the distribution or publicity of information, i.e. one market participant privately holds information that the other does not have. Privacy is personal if it is linked to one specific individual who is identified or identifiable.


Personal privacy relates to the discriminatory power of information, i.e. to the power of singling an identified or identifiable individual out of an anonymous mass. For personal privacy it is important that individuals hold private information that is connected to their identity (Jentzsch et al. 2012).


Conversely, anonymity is the state of not being identifiable within a set of subjects, the so-called anonymity set (Pfitzmann and Köhntopp 2000).
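To make the notion of the anonymity set concrete, here is a minimal Python sketch (all records and attribute names are made up for illustration): records are grouped by their observable attributes, and the size of each group is the anonymity set in which a subject can hide.

```python
from collections import defaultdict

# Illustrative records described only by quasi-identifiers (made-up data).
records = [
    {"zip": "10115", "age_band": "30-39", "gender": "f"},
    {"zip": "10115", "age_band": "30-39", "gender": "f"},
    {"zip": "10115", "age_band": "30-39", "gender": "f"},
    {"zip": "10245", "age_band": "40-49", "gender": "m"},
]

def anonymity_set_sizes(records, quasi_identifiers):
    """Group records by their quasi-identifier values; each group is an
    anonymity set, and its size measures how well a subject hides in it."""
    groups = defaultdict(int)
    for record in records:
        key = tuple(record[q] for q in quasi_identifiers)
        groups[key] += 1
    return groups

for key, size in anonymity_set_sizes(records, ["zip", "age_band", "gender"]).items():
    # An anonymity set of size 1 means the subject is uniquely identifiable.
    print(key, "-> anonymity set size:", size)
```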


    The economics of privacy focuses on incentives and actions of firms and consumers with respect to personal data. At the core are the positive or negative welfare effects arising from the disclosure of personal data. Privacy economics focuses on the trade-offs of actors, their strategies, as well as market outcomes and market failures.


    The research field also includes questions in competition if firms start to personalize products or services and/or prices, while facing consumers that are heterogeneous in privacy preferences. The economic impact of government regulation is analysed as well.

     


    Personal Data: A Peculiar Good?

Personal data can be regarded as a peculiar good, because the combination of certain characteristics leads to complex economic problems. Compared to traditional goods, personal data has been described as an 'intangible asset' (OECD 2013: 10). It has the following properties:

     

• Intangibility: Personal information is not bound to a specific medium, but can be stored in different media;

     

• Divisibility: Personal information can be shared (i.e. two or more persons may hold the same piece of information);

     

    • Non-rivalry: If one person consumes the information, the informational content is not reduced and another person can consume it as well. Information is not a scarce resource in the traditional sense, but the material it is bound to is scarce;

     

    • Non-excludability: Once information is produced (collected), it is difficult to perfectly exclude others from using it;

     

• Identity-relation: Personal information reveals, either completely or partially, the (psychological) identity of a person. It thereby introduces psychological effects that alter the utility function of individuals compared to the standard utility under anonymity; and

     

• Information externalities: The combination of different pieces of information can give rise to inferences (about income, intelligence, etc.), as illustrated in the sketch after this list. Moreover, externalities exist where the data revealed by others impacts an individual's welfare.

     

    These properties give rise to a number of problems once information is traded in a market environment.
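The inference problem named in the last list item can be illustrated with a short, purely hypothetical Python sketch: two data sources that are innocuous on their own support an inference about a person once they are joined.

```python
# Two hypothetical, separately innocuous data sources about the same user.
purchases = {"user42": ["baby formula", "diapers"]}
cities = {"user42": "Berlin"}

# Joining them on the shared identifier supports an inference that neither
# source supports alone (e.g. "probably a parent living in Berlin").
for user, items in purchases.items():
    if user in cities:
        print(f"{user}: buys {items} and lives in {cities[user]}")
```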

     


    The Legal Concept of Personal Data

    The current legal definition of personal information is stated in Article 2 of the EU Data Protection Directive:


(...) 'personal data' shall mean any information relating to an identified or identifiable natural person ('data subject'); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity;

This definition clearly states that personal data must be related to an identified or identifiable natural person.

     

     

    The Economic Concept of Personal Data


The state of personal privacy arises from an asymmetric distribution of personal data between market participants, where one side privately holds personal information. Privacy is therefore a relationship characterized by an asymmetric distribution of personal data between market players.

     

• MARKET - Economics - PRIVACY - Preference Measurement

     

    Measuring the Preference for Privacy

Measuring preferences for privacy is a complicated task. One key reason is that the very act of asking a question about privacy can raise concerns in individuals, because the question primes them on the problem. Another challenge is that privacy concerns seem to depend largely on the context in which they are elicited. Figure 1 shows that different kinds of transactions raise different kinds of privacy concerns. For the basics, see the introduction to the economics of privacy above.

     

    Fig. 1 Types of Transactions


    Source: Jentzsch (2015).

     

In the following, we provide a brief overview of current research related to preference measurement. This overview is by no means complete, but it gives the reader some guideposts to start from. Why is this important? Privacy preference elicitation mechanisms play an important role in market research (e.g. are there enough privacy-sensitive buyers to reach break-even?) and in the development of new products (is there demand for a specific privacy product?).

     

Unfortunately, research shows that the instruments for measuring privacy concerns are not empirically robust. This means that they seem to contain little information that predicts the actions of the very same individuals (an observation that has been termed the 'Privacy Paradox'). One reason why this assumed paradox might exist is that the elicitation mechanism is not neutral, but primes individuals and therefore renders artificially high levels of concern. This question is debated in the literature.

     

    At the highest level, two types of methods of privacy preference elicitation can be differentiated:

    • Question-based methods
    • Action-based methods

     

    Question-based Privacy Concern Elicitation

The most common method in the research literature is the question-based method, perhaps because it is the easiest to implement. The majority of authors use questions on personal data use and processing and scale the answers on a Likert scale ("How strongly do you agree with the following statement ..."), measuring the strength of agreement on a scale of 1 to 7, for example.
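As an illustration of such scoring, the following minimal Python sketch averages hypothetical Likert answers into a single concern index; the items and the simple unweighted averaging are assumptions for illustration, not the actual IUIPC or CFIP instruments discussed below.

```python
# Hypothetical sketch of question-based elicitation: answers to privacy
# statements on a 1-7 Likert scale are averaged into a concern index.
# The items below are illustrative, not actual IUIPC/CFIP wording.
answers = {
    "Companies should ask before collecting my data.": 7,
    "I am concerned about how my data is used.": 6,
    "Online privacy is a matter of personal control.": 5,
}

def concern_score(likert_answers, scale_max=7):
    """Average the Likert responses and normalize to [0, 1]."""
    values = list(likert_answers.values())
    return (sum(values) / len(values)) / scale_max

print(f"Privacy concern index: {concern_score(answers):.2f}")  # ~0.86
```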

There are a number of authors who propose a series of questions (Buchanan et al. 2006; Malhotra et al. 2004; Smith et al. 1996; Stewart et al. 2002); Table 1 explains in detail how they work. This is necessarily only a limited introduction to privacy preference measurement; a general overview is provided by Preibusch (2013).

    One of the problems is that the developers of these methods do not test whether the stated preferences do in fact associate with real behavior of the very same individuals.

     

    Table 1 Measurement of Privacy Concerns: Question-based Methods

Authors | Year | Measurement Approach
Malhotra et al. (2004) | 2004 | Internet Users' Information Privacy Concerns (IUIPC): a multidimensional instrument that recognizes (i) attitudes toward the collection of personal data; (ii) control over personal data; and (iii) awareness of privacy practices of companies gathering personal data as components of a second-order construct labelled IUIPC. All of these aspects lie within the domain of informational privacy.
Buchanan et al. (2006) | 2006 | Three Internet-administered scales measuring privacy concern. In a first study, participants completed an 82-item questionnaire from which three scales were derived; the correlations between scores on these scales and two established measures of privacy concern were then examined.
Smith et al. (1996) | 1996 | Concern for Information Privacy (CFIP) Scale: a 15-item instrument measuring individuals' concerns regarding organizational practices. It identified four factors (collection, errors, secondary use, and unauthorized access to information) as dimensions of an individual's concern for privacy.
Stewart et al. (2002) | 2002 | Re-evaluation of the CFIP Scale: the study examines the factor structure of the CFIP instrument posited by Smith et al. (1996). The results suggest that each dimension of CFIP is reliable and distinct; however, CFIP may be more parsimoniously represented as a higher-order factor structure rather than a correlated set of first-order factors.

     Source: Jentzsch (2015).

     


     

    Action-based Privacy Concern Elicitation

    Other studies use action-based instruments to observe concerns, such as the number of personal data items disclosed or cookie usage practices. In the latter, the subject is asked about cookie settings.
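As a stylized illustration, an action-based instrument might simply count the optional data items a subject chose to disclose; the profile fields below are hypothetical.

```python
# Hypothetical sketch of an action-based measure: instead of asking about
# concern, count how many optional data items a subject actually disclosed.
profile = {
    "name": "A. Example",          # required field
    "email": "a@example.org",      # required field
    "phone": "+49-30-1234567",     # optional, disclosed
    "birthday": None,              # optional, withheld
    "address": None,               # optional, withheld
}

optional_fields = ["phone", "birthday", "address"]
disclosed = sum(1 for field in optional_fields if profile.get(field))
print(f"Disclosed {disclosed} of {len(optional_fields)} optional items")
```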

    It is an important area of future research to develop more robust methods for privacy elicitation.

     

     

     

    IPACSO Publications

Jentzsch, N. (2015). State-of-the-Art of the Economics of Cyber-Security and Privacy, IPACSO - Innovation Framework for ICT Security Deliverable 4.1.

     

    Further Related Publications

Buchanan, T., Paine Schofield, C.B., Joinson, A.N. and Reips, U.-D. (2006). Development of measures of online privacy concern and protection for use on the Internet, Journal of the American Society for Information Science and Technology 58: 157-165.

    Goldfarb, A., and C. Tucker (2012). Shifts in Privacy Concerns. American Economic Review, 102(3): 349-53.

    Jentzsch, N., A. Harasser, S. Preibusch (2012). Monetising Privacy – An Economic Model of the Pricing of Personal Information, ENISA Report, Greece.

    Jentzsch, N. (2014). Auctioning Privacy-Sensitive Goods: A note on Incentive-Compatibility, in B. Preneel and D. Ikonomou (eds.), Privacy Technologies and Policy, Lecture Notes in Computer Science, Vol. 8450, pp. 133-142.

Kumaraguru, P. and L.F. Cranor (2005). Privacy Indexes: A Survey of Westin's Studies, Carnegie Mellon University, Institute for Software Research International, Technical Report CMU-ISRI-05-138.

Malhotra, N.K., S.S. Kim and J. Agarwal (2004). Internet Users' Information Privacy Concerns (IUIPC): The Construct, the Scale, and a Causal Model, Information Systems Research 15(4): 336-355.

Pfitzmann, A. and M. Köhntopp (2000). Anonymity, Unobservability, and Pseudonymity: A Proposal for Terminology, in H. Federrath (ed.), Designing Privacy Enhancing Technologies, Lecture Notes in Computer Science, Vol. 2009, Springer, pp. 1-9.

    Preibusch, S. (2013). Guide to measuring privacy concern: Review of survey and observational instruments, International Journal of Human-Computer Studies 71 (12): 1133–1143.

Smith, H.J., Milberg, S.J. and Burke, S.J. (1996). Information privacy: Measuring individuals' concerns about organizational practices, MIS Quarterly 20(2): 167-196.

Stewart, K.A. and A.H. Segars (2002). An empirical examination of the concern for information privacy instrument, Information Systems Research 13(1): 36-49.

     


• MARKET - Economics - PRIVACY - Privacy Metrics

     

    Privacy Metrics

The recent development of privacy metrics (metrics that quantitatively capture privacy-related aspects of a firm) challenges the general assumption among legal scholars that privacy and privacy measures are not quantifiable. The drive for quantitative measures is partially due to the increased pressure on data protection officers in firms to justify their budgets, but also due to the need to measure the effectiveness of specific measures. Within the research field, privacy metrics are a methodological advancement. The main goal of quantification is to make privacy aspects in firms measurable and comparable. Quantification also allows inter-temporal comparisons and trend analysis.

     

    Privacy metrics are related to two main areas:

• Key performance indicators used by firms or by policy makers; and
• Algorithms that relate to the sensitivity of the data in a given set.

     

This section discusses privacy metrics as performance indicators in a “return on investment” context. The selection of relevant metrics must be based upon the strategic goal of the firm (such as effectiveness measurement).


There are by now a number of examples of key performance indicators that capture privacy-relevant matters, e.g. the number of data security incidents, the number of privacy impact assessments conducted in a company, the number of lost or stolen records, etc. Two examples are the privacy risk exposure and the return on privacy investment indicator (see also Jentzsch 2015).


Privacy Risk Exposure: Privacy risk exposure can best be described as the potential loss resulting from the compromising of personal data sets held by a firm. This indicator is often the outcome of a Privacy Impact Assessment. Important is the probability with which a data breach can occur (based upon past experience in the firm or in similar firms). The input to such a calculation is often no more than informed guessing; therefore, the indicator is more qualitative than quantitative in nature.
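A minimal numerical sketch of such an exposure calculation, with all inputs assumed rather than taken from any real assessment, could look as follows.

```python
# Hypothetical sketch: privacy risk exposure as expected loss from the
# compromise of a personal data set. All inputs are informed guesses.
breach_probability = 0.05      # estimated annual probability of a breach
loss_if_breached = 200_000.0   # estimated loss (EUR) if the data set is compromised

exposure = breach_probability * loss_if_breached
print(f"Annual privacy risk exposure: EUR {exposure:,.0f}")  # EUR 10,000
```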


Return on Privacy Investment: This indicator relates the avoided potential losses from data breaches to the cost of a protective measure. Avoided losses are based on the Annual Loss Expectancy (ALE), where ALE = Single Loss Expectancy (SLE) × Annual Rate of Occurrence (ARO); SLE describes the potential loss per incident, ARO the frequency of such losses. Red, in the formula below, denotes the reduction in the frequency of breaches achieved by the measure (say, if 8 out of 10 cases can be avoided, Red = 0.8). Finally, cost of measure denotes the cost of implementing the protective measure. Thus,

     

ROPI = (ALE × Red) / cost of measure

If the outcome is greater than 1, the protective measure can be regarded as cost-efficient by the investor. Again, the inputs into this formula are rather indicative and often subject to informed guesswork; most outputs of privacy metrics are subject to this problem. Therefore, the outcome of this calculation should be accompanied by a confidence estimate regarding its quality.
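The following short Python sketch works through this calculation; the SLE, ARO and cost figures are illustrative guesses, in line with the caveat above.

```python
# Sketch of the Return on Privacy Investment calculation described above.
# All inputs are illustrative informed guesses, as the text cautions.
sle = 50_000.0   # single loss expectancy per breach (EUR), assumed
aro = 10         # annual rate of occurrence (breaches per year), assumed
red = 0.8        # reduction in breach frequency: 8 of 10 cases avoided
cost_of_measure = 100_000.0  # cost of the protective measure (EUR), assumed

ale = sle * aro                       # annual loss expectancy
ropi = (ale * red) / cost_of_measure  # avoided losses per euro invested

# A value greater than 1 means avoided losses exceed the measure's cost.
print(f"ALE = EUR {ale:,.0f}, ROPI = {ropi:.2f}")  # ALE = EUR 500,000, ROPI = 4.00
```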

     

    Further IPACSO Reading

Jentzsch, N. (2015). State-of-the-Art of the Economics of Cyber-Security and Privacy, IPACSO - Innovation Framework for ICT Security Deliverable 4.1.
