
Paradoxical data protection behavior




In the digital age, many companies and institutions collect and analyze unprecedented quantities of personal data. Therefore, the responsible handling of this data is one of the most pressing issues of our time. Although many countries have adopted data protection laws and many companies voluntarily commit to data protection, there are still many organizations that exploit the lack of consumer awareness in situations where data privacy is relevant. Although consumers often state their concerns about their online privacy, only a small share of them actually take the necessary actions to preserve their privacy; this discrepancy is referred to as the privacy paradox. On the one hand, the privacy paradox can result from an individual rational calculus in which consumers offset the benefits of certain products against the protection of their data. On the other hand, many situational influences (e.g., little time) or cognitive biases (e.g., the illusion of control) may reduce consumers' privacy concerns in certain situations. Against this backdrop, the paper first discusses the privacy paradox and captures the current state of privacy scholarship, focusing on situational and cognitive biases. Finally, we introduce the concept of three privacy gaps and develop a framework for future research.


In the course of the digitization of many areas of life, consumer habits and the marketing of products have changed fundamentally (Kannan 2017; Kumar 2018; Rust 2020). These developments have been driven in particular by personalized search and decision aids as well as personalized products and services (e.g. personalized playlists on Spotify). Effective personalization, however, also means that companies and institutions record and automatically analyze large amounts of consumers' personal data (e.g. movement data, Satariano 2019; facial recognition software, Feng 2019). In view of the automated processing of these vast amounts of data, the responsible handling of sensitive data by companies and public institutions is one of the most pressing issues of our time (Acquisti et al. 2015). With this in mind, legal regulations governing how organizations handle personal data have been enacted in recent years (e.g. the General Data Protection Regulation of the European Union, GDPR). Some companies voluntarily protect the privacy of their employees and customers to an extent that goes beyond the statutory provisions and regulations (BMJV 2020; Lobschat et al. 2020). As a result, many consumers regard policymakers and businesses as responsible, and less so themselves (PWC 2020). This transfer of responsibility leads, in many situations, to consumers handling their data carelessly, which in turn is exploited by some companies (e.g. through "dark patterns", Gray et al. 2018). The following example makes this clear: company websites often have very detailed terms and conditions, which visitors seldom read and understand completely, often leading to "blind" consent (Acquisti et al. 2020; SVRV 2017).
Efforts are being made to reduce the length and complexity of the displayed terms and conditions in order to protect consumers (as proposed, for example, by the Advisory Council for Consumer Affairs, SVRV 2016), but so far there are no binding legal regulations to this effect. There are also discussions about how strongly the default settings for cookies should be geared towards protecting privacy (Acquisti et al. 2017, 2020). Consumers, for their part, rarely have the necessary capacities (e.g. time, knowledge, technical resources) to determine the "appropriate" behavior in situations relevant to data protection. To make matters worse, the individual marginal costs of additional data protection are rising rapidly, among other things due to so-called lock-in effects. The benefit of social networks lies in the high engagement of a growing number of users, which in turn strengthens the influence of these networks and makes it difficult for individual users to opt out of them. This so-called "privacy externality" can also cause the social demand for more data protection to fade, so that consumers with a heightened interest in securing their data have to spend ever more time and money on it (Acquisti et al. 2016).

In light of these developments, consumers should be aware of their right to informational self-determination and of their responsibility for the use of their data (Trabandt and Lasarov 2020). Indeed, there are several initiatives in Germany that support consumers in such efforts, for example by providing extensive information resources (e.g. Selbstdatenschutz.info 2020; datenschutz.rlp.de 2020; bfdi.bund.de 2020). In addition, consumers often state that they pay attention to their privacy overall and check as often as possible which personal data they disclose. However, observations from research and practice suggest that knowledge of topics relevant to data protection (e.g. instruments for protecting personal data, knowledge of the consequences of misuse) and a generally positive attitude towards data protection are nonetheless not translated into concrete behavior in many specific situations (Acquisti et al. 2015, 2020; Martin and Murphy 2017). In some situations, even those consumers who are generally concerned about their data security handle the protection of their personal data carelessly (Acquisti et al. 2015).

Against this background, the present article first introduces the privacy paradox and then reviews the state of the literature, focusing in particular on situational and cognitive biases that can help explain the discrepancy between individuals' privacy concerns and their actual, often careless behavior when using digital services. Finally, the concept of the three privacy gaps is used to develop a framework for future research.

The "privacy paradox"

Numerous studies on privacy behavior revolve around the observation described above: consumers often express concern about their data security, but in many situations this concern does not translate into concrete action. This observation is known in the literature as the privacy paradox and describes the discrepancy between consumers' general (positive) attitudes towards data protection and their actual (negligent) behavior (Aguirre et al. 2015; Norberg et al. 2007). On the one hand, this attitude-behavior discrepancy can be explained by a rational cost-benefit calculation in which consumers offset the benefits of certain products and services (e.g. through personalization) against the disclosure of their data. A rational explanation is provided by behavioral decision theory (Kahneman 2003), according to which consumers base their decisions in complex, uncertain and risky situations on a rational cost-benefit calculation, i.e., in our case, they weigh the benefit of a personalized product against the associated costs of disclosing personal data (Aguirre et al. 2015; Dinev and Hart 2004; Mothersbaugh et al. 2012). On the other hand, situational influences or cognitive biases can reduce individual privacy concerns in certain situations. These psychological biases include habituation effects (Adjerid et al. 2018; Melumad and Meyer 2020), the influence of the social environment (Acquisti et al. 2012; Carbone and Loewenstein 2020; Chellappa and Sin 2005; Schumann et al. 2014; White 2004) and the illusion of complete control over the disclosure of one's own data (Acquisti et al. 2013; Bleier and Eisenbeiss 2015a, b; Martin et al. 2016; Mothersbaugh et al. 2012; Tucker 2014; Xu et al. 2012).
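The cost-benefit calculus and its distortion by cognitive biases can be sketched schematically. The following minimal Python illustration uses purely hypothetical numbers and a hypothetical `bias` parameter (none of this is taken from the cited studies); it only shows how the same consumer can rationally withhold data, yet disclose it once a situational bias discounts the perceived privacy cost:

```python
# Illustrative sketch of the privacy calculus (hypothetical values and
# weights; not an implementation of any cited model). A consumer disclos-
# es data only if the perceived benefit of a personalized service out-
# weighs the perceived privacy cost.

def discloses(benefit: float, privacy_cost: float, bias: float = 0.0) -> bool:
    """Return True if the consumer shares personal data.

    bias > 0 models a situational or cognitive distortion (e.g., an
    illusion of control) that lowers the weight given to privacy costs.
    """
    return benefit > privacy_cost * (1.0 - bias)

# Rational calculus: the high privacy cost blocks disclosure ...
assert discloses(benefit=3.0, privacy_cost=5.0) is False
# ... but the same consumer may disclose once a control illusion
# discounts the perceived cost (the apparently "paradoxical" behavior).
assert discloses(benefit=3.0, privacy_cost=5.0, bias=0.5) is True
```

The point of the sketch is only that the "paradox" need not lie in the benefit or the cost themselves, but in the situational weight attached to the cost.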

The gap between self-reported general attitudes and behavior in specific situations is well known from related research areas (e.g. sustainable consumer behavior, White et al. 2019; Webb and Sheeran 2006). However, the term "paradox" gives a misleading impression of this attitude-behavior gap: with regard to digital services it can certainly happen that, from the consumer's point of view, the benefit gained from the data transfer exceeds the associated costs, so that apparently negligent behavior is not paradoxical but rational. Even strict attitudes towards data protection do not contradict a liberal handling of data in certain situations in which other needs are far more important (e.g. in medical emergencies). Therefore, especially recently, it has increasingly been questioned whether the paradox even exists in this form at the individual level (Norberg et al. 2007; Solove 2011; Solove and Schwartz 2020). In view of possible long-term negative effects and the immense importance for society and the economy, the debate about the privacy paradox should nonetheless be continued (Martin 2020). In addition, numerous paradoxical behaviors cannot simply be traced back to rational cost-benefit considerations, because consumers sometimes disclose a disproportionate amount of personal data without expecting any significant consideration or subjective benefit in return. Several psychological models help explain this paradoxical behavior. The central factors are discussed in the following on the basis of the available literature.

Overview of the literature

In recent years, numerous studies in the fields of psychology, behavioral economics and consumer behavior research have dealt with the psychological factors influencing consumers' data protection-relevant behavior. These factors extend the rational cost-benefit calculation to include situational and contextual influences and even emotional and irrational decisions (Acquisti et al. 2020). Table 1 summarizes the relevant conceptual and empirical studies of the past few years. The table includes a brief description of the respective psychological influencing factors and the data protection-relevant behavioral consequences, and refers to the relevant literature.

In the following, we first look at studies on the behavioral consequences. Previous studies have mainly dealt with two such consequences. On the one hand, they examined to what extent factors relevant to data protection influence the willingness to consume a certain product. On the other hand, they researched to what extent (how much, to whom, etc.) consumers disclose their personal data. With regard to the influencing factors, we consider the personal benefits that would result from the use of digital services, as well as users' privacy concerns and desired privacy; these enter into consumers' cost-benefit calculation. The core of the literature review concerns insights into situational and cognitive biases in the weighing of benefits against privacy concerns.

Consequences of behavior

Securing privacy. First of all, consumers differ in their willingness to disclose personal data in certain situations. The amount of personal data provided can, in turn, significantly affect the quality of use of digital products and services, e.g. in social networks. A typical application example is the use of smart-home devices, whose meaningful use is only possible through the processing of personal data (e.g. passive listening by the intelligent speakers of a voice-controlled, internet-based assistant) (Adjerid et al. 2018; Carbone and Loewenstein 2020; Acquisti et al. 2012; Brandimarte et al. 2012, 2013; Goldfarb and Tucker 2012; John et al. 2011; Mothersbaugh et al. 2012; Wirtz and Lwin 2009).

Use of digital products or services. With regard to the use of digital services, the literature often considers the entire customer journey, i.e. the processes of information acquisition, the purchase of products and services, and their recommendation. For example, studies examine the acceptance of personalized offers, the interest in further information prior to purchase, and the eventual acquisition of the product (Aguirre et al. 2015; Bleier and Eisenbeiss 2015a; Schumann et al. 2014; Tucker 2014). Frequently examined behavioral consequences in this context are click-through rates (Aguirre et al. 2015; Bleier and Eisenbeiss 2015a; Tucker 2014), the purchase of the product (Goldfarb and Tucker 2011a, b; Miyazaki 2008; Tsai et al. 2011), and consumers' willingness to recommend the product or to share their negative opinion about it with others (word of mouth). For example, Miyazaki (2008) shows that the covert use of data collection technologies (e.g. cookies) can lead to negative word of mouth.

Cost-benefit calculation

Personal benefit from using digital products or services. In data-sensitive situations, consumers often weigh the benefits for which they "pay" by disclosing their personal data. Accordingly, studies confirm that people handle the protection of their personal data differently if they consider a certain product useful (Aguirre et al. 2015; Awad and Krishnan 2006; Bleier and Eisenbeiss 2015a; Gabisch and Milne 2014; Goldfarb and Tucker 2011a, b; Lasarov 2020; Mothersbaugh et al. 2012; Tucker 2014; White et al. 2008). White et al. (2008) show, for example, that a high perceived usefulness of a product leads to fewer worries about the associated disclosure of privacy. On the one hand, this benefit can be monetary: Gabisch and Milne (2014) show that users are more willing to disclose their data if doing so is associated with financial incentives. On the other hand, the benefit can also relate to other personal interests, e.g. health interests when using the Corona warning app (Dehmel et al. 2020; Lasarov 2020). In addition, increased personalization and the individualized tailoring of products and services can be perceived as very useful. Gabisch and Milne (2014) examine, for example, whether consumers feel that the personalization of services and online products provides enough benefit to justify the disclosure of their personal data. However, a high degree of personalization can also trigger distrust of the company and reactance, which in turn reduces the use of the company's products (White et al. 2008). Here, too, a paradoxical effect becomes visible: personalization increases the benefit on the one hand, but can also trigger reactance and mistrust on the other.

Privacy concerns. General and situational concerns about the handling of their personal data have a significant impact on the behavior of consumers in situations relevant to data protection (Aguirre et al. 2015; Acquisti et al. 2015; Goldfarb and Tucker 2012; Martin 2015; Sheehan and Hoy 2000). General privacy concerns relate to consumers' beliefs, attitudes and perceptions about their privacy (Smith et al. 1996). In research, these are often measured using the so-called consumer privacy concern scale (Smith et al. 1996; Malhotra et al. 2004). In the literature to date, privacy concerns have been examined as predictors, but also as moderators and as data protection-relevant outcomes (Xu et al. 2012).
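Multi-item concern scales of the kind cited above are typically scored by averaging a respondent's Likert ratings across the scale's items. The following short Python sketch illustrates only this generic scoring logic; the number of items, the 1-7 response format, and the example ratings are assumptions for illustration and are not taken from Smith et al. (1996) or Malhotra et al. (2004):

```python
# Hypothetical illustration of scoring a multi-item privacy-concern
# scale: each respondent rates several statements on a Likert scale
# (here assumed to run from 1 = strongly disagree to 7 = strongly
# agree), and the mean rating serves as that respondent's concern score.

from statistics import mean

def concern_score(item_ratings: list[int]) -> float:
    """Average a respondent's Likert ratings into one concern score."""
    if not item_ratings:
        raise ValueError("at least one item rating is required")
    if not all(1 <= r <= 7 for r in item_ratings):
        raise ValueError("ratings must lie on the assumed 1-7 Likert scale")
    return mean(item_ratings)

ratings = [6, 7, 5, 6]  # one respondent's answers to four invented items
assert concern_score(ratings) == 6.0
```

Scores computed this way can then serve as the predictor, moderator, or outcome variable mentioned in the studies above.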

Situational and cognitive biases

Situational and cognitive biases describe factors that influence consumers' specific data protection-related actions beyond rational cost-benefit considerations and lead to deviations from their actual (general) attitudes, convictions and intentions. These can include the usage environment (e.g. the technological platform used, Melumad and Meyer 2020) or emotions (Dowling et al. 2020). Previous studies have examined, among other things, information asymmetries, trust and transparency, illusions of control, the social environment, habituation and perceived vulnerability.

Information asymmetries. Consumers often do not know when, how and to what extent companies collect and process their data, or which other services and companies also have access to this data. The main reason is that in everyday life consumers mostly lack the necessary (cognitive, temporal) capacities, or they lack the technical or legal background knowledge required to understand, for example, complex data protection regulations. These so-called information asymmetries and their consequences (e.g. distrust of companies, low willingness to buy) are very difficult to reduce again (Acquisti et al. 2013; Habib et al. 2018; Hoofnagle and Urban 2014; Martin and Nissenbaum 2016; Mothersbaugh et al. 2012; Martin 2015). Merely providing notice of a company's privacy policy does not necessarily prevent consumers from perceiving information asymmetries and remaining critical of its data protection (Martin 2015). In addition, it matters how companies convey information relevant to data protection to consumers. Vail et al. (2008) show, for example, that "traditional" and detailed privacy policies are more likely to be accepted by consumers as trustworthy, although such policies do not reduce information asymmetries due to their length and complexity. Thus, a measure that actually hinders the reduction of information asymmetries can paradoxically increase consumer confidence in the company.

Trust and transparency. Trust in organizations can significantly influence the extent to which consumers share their personal data with them (Martin and Murphy 2017). In particular, a credible and transparent data protection policy has been examined in the literature as a trust-building success factor. A credible data protection policy can strengthen consumer confidence (Lockamy and Mothersbaugh 2020), e.g. through independent data protection seals and voluntary information on data protection (Gabisch and Milne 2014; White 2004). However, a high level of trust also goes hand in hand with increased consumer expectations regarding companies' data security, and failing to meet these expectations can have a negative impact on the relationship between consumers and companies (Martin 2015; Gabisch and Milne 2014). Careless handling of customer data can cost companies credibility and have significant negative financial and legal consequences (Malhotra and Malhotra 2011; Romanosky et al. 2014). Conversely, a company's transparent handling of the data it uses can strengthen consumer confidence (Martin et al. 2016; Norberg and Horne 2014; Wirtz and Lwin 2009).

Illusion of control. Consumers' perceived control over their own data is considered an important factor influencing their behavior in data-sensitive situations (Acquisti et al. 2013; Bleier and Eisenbeiss 2015a, b; Martin et al. 2016; Mothersbaugh et al. 2012; Tucker 2014; Xu et al. 2012). On the one hand, a feeling of control can reduce consumers' reactance and privacy concerns towards the company (Acquisti et al. 2020) and thus improve the relationship between consumers and companies (Tsai et al. 2011). However, perceived control can also trigger a paradoxical effect, introduced in the literature as the so-called control paradox (Brandimarte et al. 2013): perceived control over one's own data leads consumers to disclose it more carelessly (Brandimarte et al. 2012, 2013; Norberg and Horne 2014). With regard to control over one's own data, the opposite perception can also arise, namely a feeling of resignation. In this case, consumers feel helpless and powerless and are convinced that they cannot protect their data anyway, or that as active citizens they cannot get by in a modern world without digital participation (Acquisti et al. 2020; Barassi 2019; Draper and Turow 2019). Ironically, both the perception of control and resignation can have the same consequence: consumers are negligent with their personal data. Furthermore, the "nothing to hide" argument can be observed in this context, which holds that state measures serve to monitor illegal activities and therefore do not affect people who behave in accordance with the rules (Solove 2011). Here, a feeling of supposed control can arise simply through compliance with the law, although surveillance measures are in many cases independent of specific suspicion.

Social environment. The social environment plays an important role in how consumers handle their personal data, in two ways. On the one hand, the social environment often serves as a reference for the "correct" behavior in certain situations and helps consumers form their attitudes towards a certain topic. Indeed, studies have confirmed that consumers are more likely to reveal their personal data when using a certain technology if they have already observed this behavior in others (Acquisti et al. 2012). In addition, consumers can be influenced in their data protection-related decisions by social norms, herd behavior, trust in other platform users, and notions of reciprocity (Acquisti et al. 2012; Chellappa and Sin 2005; Schumann et al. 2014; White 2004). Finally, the need to share personal information with other people can have a significant impact on how much attention consumers pay to data protection aspects (Carbone and Loewenstein 2020).

Habituation. How consumers handle their personal data depends not least on how much they become accustomed to certain circumstances and situations. For example, data protection-related problems may be the focus of attention in a certain situation or for a certain period of time (e.g. when a data protection scandal is reported in the media), but over time other issues come to the fore, while the data protection-related problems remain unsolved or even continue to develop unnoticed to the detriment of consumers (e.g. through agenda setting). Among other things, this leads consumers to become accustomed to circumstances that could actually harm them (Adjerid et al. 2018). In addition, the technological platform used can produce habituation effects and influence the handling of personal data. Melumad and Meyer (2020) show that consumers are more likely to share personal information on social networks if they use a smartphone rather than a PC.

Perceived vulnerability.