
Alex Voorhoeve and Lichelle Wolmarans (LSE) “What Makes Processing of Private Data Permissible?”

5 June 2019, 4:30 pm – 6:00 pm


Abstract: Social networking sites (SNS) typically provide access to information, goods, and services in return for the right to process users’ personal data. In order for their personal data to be processed by SNS, users must have given legally valid consent. The purpose of such consent is to waive the user’s right to privacy, thereby making it permissible for the data controller to proceed with the specified analysis of personal data. Standard notice-and-consent regimes, as outlined by the EU General Data Protection Regulation, are meant to ensure that users offer such consent. In this paper, we consider and reject a leading account for assessing the adequacy of such regimes, and propose an alternative account.

According to the canonical Autonomous Authorization (AA) account, consent must be an autonomous, unambiguous, permission-giving signal. The act of consent is autonomous when the consenting party is competent to authorize the course of action, has an appropriate understanding of the content and likely consequences of consent, and consents intentionally in the absence of coercion and manipulation (Beauchamp 2009, 55-70). When (and only when) these conditions are met, consent is morally transformative, making it permissible for the party soliciting consent to proceed.

Many experts have argued that the consent mechanisms employed by SNS fall short of ensuring that the AA account’s conditions for morally transformative consent are met (see, e.g., Solove 2013, Custers et al. 2013, Schermer et al. 2014, Hull 2015). Long, complex terms and conditions and privacy policies lead to information overload, leaving users unable or unwilling to invest the time necessary to form an appropriate understanding of what they are agreeing to. The nature of big data and analytics inhibits the formation of appropriate beliefs about the likely consequences of consent by rendering severely uncertain the future implications of offering up one’s personal data. Furthermore, the quantity of consent transactions users are presented with results in consent desensitisation: users automatically accept requests for consent without weighing the legal and moral implications. Finally, important cognitive biases also affect our understanding and ability to make decisions regarding the collection and processing of our personal data, leaving users vulnerable to exploitation. One such bias is price bias: consumers are not good at differentiating between the price of a good and the cost of a good (Hoofnagle and Whittington 2014). As SNS market the price of access to their service as free, users neglect to factor in the cost of granting SNS access to and use of their personal data. When it comes to the cost-benefit analysis of joining an SNS, the potential benefits (social capital, entertainment, etc.) seem clear and immediate and are vividly presented, whilst the costs or risks (the implications of allowing private companies to collect and use personal data) are not immediately clear and take place at some indeterminate time in the future. SNS also routinely engage in nudging, prompting users to reveal more information about themselves or their friends in exchange for small benefits such as access to an app.
Under these circumstances, it is increasingly easy for SNS to coax or manipulate individuals into revealing, or handing over the rights to, more of their personal data than they wish to (Hull 2015). These factors jointly undermine the intentionality of users’ consent and the role it plays as a permission-giving signal. Though ticking the box may be a legally unambiguous permission-giving signal, it is not morally unambiguous in light of the factors eroding the autonomy of the consent.

One response would be to revise regulation to introduce compliance mechanisms which ensure that users’ consent meets the AA view’s standards. We criticize this idea on two grounds. First, meeting the AA view’s standards would make the process of gaining access to SNS unduly onerous, involving demanding tests of understanding on the part of the user. The AA view’s tests would therefore create severe barriers to access to SNS, even when such access in exchange for the right to use personal data would be, on balance, in users’ interest. Second, we argue that the AA view does not take account of the power imbalance between users and SNS, which are often (near) monopolists in terms of access to centrally important online goods and services. It follows that even in a fully autonomously authorized transaction, individuals have limited scope to enact their preferences regarding data processing, since the terms of exchange are set by the SNS.

We argue that both problems can be solved by focusing instead on ensuring that users have valuable opportunities to access SNS and to make choices regarding the processing of personal data once they have accessed the SNS. In determining the value of these opportunities, we build on, and revise, T.M. Scanlon’s Value of Choice view (Scanlon 1998, chapter 6). Scanlon proposes that we determine the value of a person’s opportunities with reference to three reasons to value choice (1998, 251-3). The first is instrumental: the value choice has in securing ends we have reason to seek, such as purchasing theatre tickets, forming new relationships online, or earning a living as a social media influencer. This value is conditional on the degree to which, for a given object of choice, a person’s capacities, dispositions, and conditions of choice will help them achieve these ends. It is also relative: it depends on the usefulness of a person’s being given a particular choice as compared to other means of achieving these ends. The second is representative: the value we place on seeing features of ourselves manifested in our actions and their outcomes. An example is having our online profile reflect our tastes, judgments, and abilities, for good or ill. The third is symbolic: the value of being recognized as competent to make particular choices. Scanlon recognizes that the degree to which a particular choice will help a person achieve these values will depend on the person’s tastes and decision-making abilities. Nonetheless, he posits that we should regard an opportunity set as a good one when it generally allows people to secure these three values (1998, 261). On Scanlon’s view, when an individual has been given opportunities that are sufficiently valuable in this sense, it is the value of these opportunities, rather than fully autonomous consent, which justifies social arrangements that make how a person ends up depend on their choices.

Following Voorhoeve (2008), we argue that this focus on the generic value of an opportunity set is misguided. People may, due to factors for which they are not fully to be held responsible, differ in how effectively they make use of particular opportunities. It would unfairly disadvantage those who have poorer, or simply atypical, choice-making abilities if we ignored these differences. Instead, we propose that the value of a person’s opportunity set be determined by the value of the goods and evils that they can achieve through their choices, taking into account how disposed they are to choose their better options and avoid their worse options.

Applied to the case of opportunities to exchange rights of control over personal data for a variety of SNS’ services, our proposed view holds that these opportunities are adequate to justify social outcomes (and validly waive privacy rights) if and only if a dispositionally diverse set of users, each with different preferences and decision-making abilities, can pursue their interests on SNS in a manner that strikes a decent balance between (a) the potential benefits provided by SNS; (b) the risks of harmful personal data processing by SNS; and (c) the costs of access.

Our proposed view therefore requires regulating what behavioural scientists refer to as the “choice architecture” of the SNS concerning personal data, so that a variety of users can reasonably be expected to make choices that achieve their ends without undue risk of harm. A virtue of this view is that SNS cannot defend current practices with claims about what people ought to do to safeguard their personal data, but must take account of what various kinds of people in fact do with the opportunities presented. The optimal arrangements governing the exchange of personal data for SNS services thereby become in part an empirical question of what enables various types of users to make appropriate choices to avoid harm and achieve good outcomes.

It is noteworthy that, on this view, we need not always aim for fully autonomous authorization (in this, it aligns with suggestions by Schermer et al. 2014). For example, in order to avoid information overload and desensitisation, an exchange of basic access for the right to strictly circumscribed forms of data processing might not need to meet demanding standards of understanding and competence. (More demanding standards could be employed for higher-risk exchanges, if doing so would, given people’s decision-making abilities, be beneficial to users.) In requiring that agreements to access SNS be structured to the advantage of dispositionally diverse users, the view also addresses the power imbalance between SNS and their users.

Details
Venue

LAK 2.06
Lakatos Building
London, WC2A 2AE United Kingdom
Website:
http://www.lse.ac.uk/