The Political Economy of Privacy Enhancing Technologies

02/17/2022
by Partha Das Chowdhury, et al.
University of Bristol

PETs have increasingly become vital empowering tools in today's highly datafied society. However, their development has been primarily concerned with improving usability and ensuring confidentiality online. Privileging these considerations may unintentionally lead to fixed ideas about users, yet diversity of thought, action, ability, and circumstance plays a fundamental role in how any PET is adopted and accepted. In this paper we elaborate on some manifestations of the resulting distortions, such as inadequate and exclusionary design and an uneven distribution of costs and benefits. Drawing on Amartya Sen's capability approach, we propose that a normative evaluation of personal, social, and political diversities can serve as a foundation for conceptualizing and developing PETs. We outline a research agenda based on this proposition and suggest pertinent empirical and methodological research paths. Our contribution offers an evaluative space for making inter-personal comparisons to inform the development of PETs.


1 Introduction

While privacy has been recognised as a fundamental right, there has been debate as to whether technical and regulatory interventions adequately allow everyone, irrespective of their circumstances, to exercise this right [guberek2018]. This is tied to the fundamental question of how privacy protection tools are conceived, and of the assumptions made when designing and building such tools. For instance, PETs have often treated the human at the other end of the system as a passive (often languid) user who is being done a favour. This notion is yet to account for the multiple ways people can be re-identified, profiled and harmed. It does not reflect people's diverse interactions (or lack thereof) with technology, nor their short- or long-term social, political, and economic circumstances. Discounting human agency and its diversities while developing such systems may not only hinder adoption due to technical misfit but may also be unintentionally exclusionary [salma2017].

Other fields, for example security economics, have made considerable progress in widening the discussions surrounding end users [odylzko2003, acquisiti2016, acquisitilikes2021]. Human Computer Interaction (HCI) and the usable security community have made a strong case for putting humans at the heart of systems design [schlesinger2017intersectional, adamss99, dodier2017paternalistic] through cross-pollination with the social sciences and experimentation with participatory and co-design methods. While these efforts are recognized in other areas of technological development, the research community behind PETs has remained mostly concerned with corrective mechanisms and questions of usability [ruoti2019, vemou_classification_2013]. We argue that there is a need to expand the focus of PETs to account for multiple formulations of well-being (i.e., beyond utilitarian views) sought by individuals and the diverse realities of age, education, ability, gender, race, and socio-political situation [senrational1994].

Drawing on the work of Amartya Sen [sentanner], we posit that an evolution of PETs based on the capability approach (first articulated by Sen in the Tanner Lectures on Human Values, delivered at Stanford University in 1979; available on the Tanner Lectures website and reprinted in John Rawls et al., Liberty, Equality and Law, Cambridge: Cambridge University Press, 1987) will enable individuals to achieve privacy in a manner they are able to and deem valuable. We argue that the capability approach brings an evaluative space to systematically assess individuals' opportunities to live a private life by putting their freedom of choice right at the heart of that assessment. In this paper, we present and discuss the consequences of developing PETs without a clear understanding of the diverse realities of individuals and argue how the capability approach can be used in research for the design and development of PETs. In summary, our goal is to provide PETs developers and the research community with insights on a normative evaluation of personal, social, and political diversities towards conceiving PETs.

This paper is structured as follows. In Section 2 we discuss the consequences of developing systems without a clear understanding of the diverse realities of individuals. In Section 3 we give a brief overview of the capability approach. Based on this, we propose a research agenda and outline key themes and methodologies in Section 4, followed in Section 5 by a brief discussion connecting the capability approach to some relevant examples. Section 6 concludes.

2 Consequences of Discounting Individual Realities

Conventional approaches to designing PETs are not adequately sensitive to the personal and social circumstances of people [hausman_2007]. The prevailing view of privacy as confidentiality focuses exclusively on protecting data considered personal or sensitive, with the assumption that users will have adequate skills to protect their data [gurses_pets_2010]. While we acknowledge the research towards engineering privacy at the intersection of a plurality of interests and contexts [gurses2011engineering], this view can be expanded to account for the myriad ways in which people's privacy may be compromised through other means of identification. One might, for instance, draw attention to the growing ethical concerns over the use of people's digital traces, including seemingly innocuous data, for purposes of behaviour prediction, cross-referencing, profiling and policing [van_der_sloot_regulating_2020, brayne_big_2017]. In the following subsections, we outline some of the consequences of discounting individual realities, manifesting in adverse behaviours and harms linked with the use of information systems.

2.1 Information Overload, Asymmetries and Moral Hazards

Citizens are confronted on a daily basis with myriad data transactions with information systems, yet very little is known about what goes on in the background or what exactly the quid pro quos of such transactions are. How data is collected, transmitted and processed by information systems has remained largely concealed behind black boxes [chaum1985]. Concerns about the negative consequences of data sharing with commercial and governmental entities materialize in a range of adverse reactions including fear, lying, reticence and feelings of resignation about participating in online activities [draper_corporate_2019, pink_data_2018]. We discuss some of these here.

Resignation

A widespread assumption among service providers is that individuals will be able to process complex legal jargon and take an informed decision. Shifting the burden of obtaining informed consent to citizens is at odds with adequately empowering them to take control [anthonysamy2013social, fabianel2017]. Instead, this has led to a regime of misinformed and often coercive consent [millett2001]. Evidence of this is the proliferation of purposely misleading consent controls, or dark patterns [acquisitilikes2021, luguri_shining_2021]. Such asymmetrical relations engender various adverse behaviours and feelings of resignation among users. Research shows that there are disconnects between stated privacy policies and the controls used to implement them [anthonysamy2011]. This means that even if a trained individual is able to navigate complex policies, there might not be adequate controls to enable the functioning of a private life. Service providers instead take advantage of people's need to access online services despite their inability to process these complexities. Contrary to the view of the privacy paradox, users have little choice but to accept obscure terms and conditions in exchange for online services when the risks are not well understood and they are overburdened with information [solove2021].

Lying

Similarly, requiring the possession of credentials as an obligatory point of passage for the provision of online services might lead to a situation where those who do not possess them need to pretend that they do. Ramokapane et al. outline the use of lying to access services online [ramokopane2021]. The tendency to lie leads, for example, to moral hazards in cyber-insurance [vakilinia2019]. To counter the effects of lying, service providers resort to intrusion to ascertain eligibility for access to certain goods or services. In such cases, the incentives are poor not only for individuals but also for the entities that store the data. Healthcare providers, too, demonstrated reluctance to adopt electronic medical records (EMR), the adoption of which could have saved lives, particularly by reducing infant mortality; the complexity of the regulations acted as a barrier for these providers [miller2012].

Reticence and fear

In the last decade, there has been increasing public awareness of the ubiquity of surveillance enabled by the huge amounts of data held by commercial actors (notably social media platforms) and the use of biometrics and face recognition technologies by governments. Numerous studies have evidenced the diverse chilling effects manifesting in practices of self-censorship, self-restraint or changes of behaviour online, such as limiting the sharing of pictures or other information on social media [manokha_surveillance_2018, forte2017]. Humbert et al. conducted a survey on the interdependent privacy risks individuals face through the activities of their friends. Their findings highlight the impact of irresponsible online behaviour on friends or family who do not directly use technology [humbert2020]. These reactions limit the functioning of individuals, who either lose out on the benefits of participating in the digital economy or are afraid of expressing themselves freely.

Gaps.  An assessment of the abilities of individuals from diverse social and educational backgrounds to process complex technologies and legal documents, and to respond to ubiquitous connected devices, is missing in the evolution of PETs. The extant environment makes it difficult for many to achieve the functioning of a private life.

2.2 Individuals Being Treated as Less

The provisioning of PETs is often seen as a special benefaction on individuals, about whom a number of assumptions are made. A prevalent line of thinking among developers is that of fixing the user. This is true of the way application developers perceive their users, as well as of the manner in which application developers are themselves perceived by API providers. In both instances the diktat is that users should behave in a particular manner or else they are threats to the system. A case in point is the futile expectation that users will heed SSL certificate warnings. Sasse, for instance, argues against the over-use of warning messages and concludes that repeated warnings can be counterproductive, defeating the purpose they were meant to serve [sassescaring2015].

The impact on users and non-users

The field of science and technology studies, and notably feminist scholarship, has long been concerned with how users are configured based on detached and self-referential models by developers (who commonly belong to privileged groups). Systems are typically built on tendentious assumptions driven by the providers' point of view of what is right for the user. This has led to inadequate generalisations manifesting in problems of misalignment (e.g. gendered technology) or exclusion on the basis of race, age, dis/abilities or other traits [oudshoorn_configuring_2004, doorn_theorizing_2008]. In the context of PETs, a prevalent assumption has been that users should be responsible for their own privacy, which could be enhanced by means of anonymity, encryption and secure channels of communication [gurses_pets_2010]. Users are thus imagined as possessing the right set of knowledge, skills and resources to find and make use of PETs. However, there might be a disconnect between developers' assumptions about the idealized user and the very specific needs of people, for example those in high-risk or vulnerable situations such as whistleblowers, protesters or refugees. A cognate issue is that developers may be biased towards users interfacing with technology while being blind to a vast number of non-users (such as the elderly or disconnected), who may not directly interact with online systems but whose data may well be collected by various information systems. Indeed, the discussion of threats is largely restricted to the realm of the web and the internet browser, without consideration of those at risk of being surveilled by other means of data collection, where such exposure leads to profiling, identification and other dangers [solove_privacy]. With the advent of big data, a number of ethical issues have come to the fore around the use of people's digital traces and statistical prediction for the automation of decisions related to access to welfare, employment, public services, and credit scoring [barocas_big_2016].

There is an instructive analogy between this thinking and the contractarian model of jurisprudence. Kant and Rousseau proposed that legal institutions be established first, with citizens then effectively compelled to follow them or else be treated as outlaws. That approach is geared more towards ensuring the survivability of institutions than towards being sensitive to the citizens they are meant to serve [senjustice]. Similarly, in a vanilla model of privacy systems design, a privacy policy is decided first and mechanisms to implement that policy are decided afterwards. Anyone whose behaviour deviates from the specifications of the protocol designer is an attacker, and the protocol designer rarely provides reasons for expecting a particular allowed behaviour. The problem with these assumptions is that they make systems hard and unpleasant to use [pdc10]. HCI research over the last few decades has argued strongly against blaming and fixing the individual, and in favour of being sensitive to their realities [adamss99, sasse2015]. The pandemic has required people with widely varying backgrounds and deprivations to participate in online activities, for example students from the poorest parts of the world as well as the elderly who might not be technically conversant (see https://www.thelancet.com/journals/landig/article/PIIS2589-7500(20)30169-2/fulltext). There is a clear need for PETs research to study and build for vulnerable groups who are less privileged, less abled or are in risky situations [wang2018] and who may be inadvertently rendered invisible by developers.

The impact on developers

The contractarian attitude is also reflected in how security API developers relate to their primary users (application developers). Applications developed with third-party APIs, though responsible for the privacy protection of their users, often fail to provide it. Hedin et al. studied the information flows of libraries included via browser APIs and found that some sites ensure data does not leave the browser, others share it with the originating server, while yet others freely propagate it to third parties [hedin2014]. Acar et al. argue for a better understanding of the motivations and priorities of developers rather than blaming them for not being mindful of security. They stress the need for developer-centred studies to understand the challenges developers face when using security APIs, and the resources available to improve the usability of these APIs [acar2016]. The APIs used by developers are not easy to use and, to add to the challenge, the documentation needed to use them safely is not easily available or comprehensible. They sometimes also interfere with the functionality of the applications.

Gaps.  PETs expect a specified behaviour, as well as adequate expertise, from the individuals who are supposed to benefit from them. However, individuals are active agents, acting and doing things on their own. The gap lies in accommodating individuals who might not behave in the particular way specified by the PETs designer.

2.3 Narrowly viewing individual attitudes towards privacy

A manifest shortcoming of failing to recognize humans in their different contexts and cultures is the academic view that individuals do not value their privacy, inferred from their seemingly contradictory online behaviour. For instance, based on a review of the literature on privacy behaviour, Barth et al. outline rational and irrational factors, lack of information, and evaluations of risk that go into individuals' privacy behaviour [barth2017]. Acquisti et al. review the growing recognition of behavioural alongside utilitarian aspects of privacy decision making. The depiction of individuals as rational, albeit selfish, beings maximising a utility function led to the characterization of the privacy paradox [acquisiti2004, acquisiti2016]. This view has, however, attracted criticism over the limitations (e.g., ignoring relativism) in the way such conclusions are arrived at [solove2021]. The emergence of nudges targeted at different types of users based on their preferences reflects a paternalistic tendency among the interventions [acquisitinudges2017].

While the surface manifestations are studied in the HCI and economics of privacy literatures, we scrutinise whether these manifestations are rational. Social choice theory is rich with research postulating rationality as one of many outcomes. When a wedding cake is cut, some want the icing and some the cake; yet in most such cases individuals will seldom pick up the largest slice. This behaviour is inconsistent with the usual formulation of rationality as maximisation of utility. Sen, however, describes it as 'menu-dependent behaviour', given an individual's presumptions about how others will behave. Such behaviours broaden the scope of well-being to include social traditions, imitation, and behaviour driven by morality, sympathy and cohesion, among others [senrational1994]. When it comes to using online systems, humans might have diverse, perfectly rational reasons to trade off their privacy (e.g., being overburdened with information or needing to access a service quickly), which does not necessarily signify carelessness, naivety or indifference towards privacy.

Gaps.  The distinct shortcoming is dismissing the revealed preferences of individuals as reluctance towards privacy. The functioning of a private life will need to assess and accommodate these preferences, which are moderated by social dynamics.

2.4 Creation of winners and losers

People are at the centre of the debates and mechanisms surrounding privacy and the technologies that support it. PETs can potentially rearrange power between individuals and the large corporations and nation states that collect, store, process and benefit from information about their customers and citizens. On the other hand, the formulation of a uniform set of privacy requirements as universally beneficial for everyone across contexts is rather canonical and has inadequacies in various contexts [jacobchicago]. Such presumed uniformity has manifested in PETs that fail to adequately adjust to variation. For example, Beyleveld points out that true anonymization in the context of medical research can violate privacy rather than protect it: the legal right to know, provided as part of privacy legislation, can be compromised by anonymization techniques [chadwick2011].

The multidisciplinary field of surveillance studies has extensively debated how commercial and political interests around surveillance engender multiple ethical tensions and create winners and losers [andrejevic_big_2014, lyon_surveillance_2014]. While disclosing certain information in certain contexts is fairly unproblematic (e.g., to access student discounts), in more complex cases, such as criminal records, the degree of disclosure may directly impact equal opportunities for ethnic minorities [jacobchicago]. The situation is equally complex in the context of medical research: the absence of a transparent, verifiable data protection regime could directly affect legitimate and positive uses of data in medical research [nuffield]. On the other hand, the web ensures that the misdeeds of individuals are permanently stored, leading to discrimination based on past behaviour [brandimarte2015]. However, enabling individuals to delete their unpleasant past is fraught with economic and political influences; it is not easy for them to delete information about themselves that they do not want in the public domain [ramokopane2017]. Individuals, even as organised groups, are too politically weak to sustain the pressure required for effective regulatory control (see The Moral Character of Cryptographic Work, Phillip Rogaway, 2015 IACR Distinguished Lecture). Politics remains extremely relevant to the effectiveness of regulatory controls, and to whom the benefits of a broken control regime accrue is a pertinent question for the provisioning of PETs as a social good.

Gaps.  PETs require an appreciation of individuals in their social, economic and political context and of the ongoing tensions, and they must evolve an understanding of the winners and losers they might end up creating.

3 Capability Approach

Sen proposed the capability approach as a framework of thought and a formula for making interpersonal comparisons of welfare. Intrinsic to the notion of capability is an active individual with their beings and doings [sencapability]. The important primitives of the approach can be summarised as:

  • Functionings - The beings and doings of a person. For example, living a private life is a functioning.

  • Capabilities - The opportunity or advantage an individual has to achieve from among the alternative functionings; formally, a set of vectors of functionings.

Functionings relate to living conditions, whereas capabilities denote the ability to achieve a particular functioning. The capability approach sees the person in their social and political realities. In terms of formalization, Sen [senformalization1985] and Robeyns [robeyns2001] present the capability approach as follows:

Let $x_i$ be the vector of commodities possessed by person $i$, and let $c(\cdot)$ convert the commodities into their corresponding characteristics. The function $f_i(\cdot)$ converts the characteristics into functionings, s.t.

$b_i = f_i(c(x_i))$.

The function $f_i$ is person-specific because it depends on individual conversion factors, and each individual will choose an $f_i$ from the set $F_i$.

Robeyns extends the original formulation to account for social and environmental factors (e.g., policies, social norms, infrastructure), denoted here as $z_i$. Then

$b_i = f_i(c(x_i), z_i)$.

For a given commodity vector $x_i$, $P_i(x_i)$ is the set of functionings feasible for a person, where

$P_i(x_i) = \{\, b_i \mid b_i = f_i(c(x_i)) \text{ for some } f_i \in F_i \,\}$.

For any $x_i \in X_i$, where $X_i$ is the set of entitlements (commodities), the capability set (or feasible functionings) is determined as

$Q_i(X_i) = \{\, b_i \mid b_i = f_i(c(x_i)) \text{ for some } f_i \in F_i \text{ and some } x_i \in X_i \,\}$.
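To make the formalization concrete, the following minimal sketch (ours, not the paper's) computes the capability set $Q_i(X_i)$ for a toy example. The commodity vectors, the characteristics function standing in for $c(\cdot)$, and the two utilization functions standing in for $F_i$ are hypothetical values chosen purely for illustration.

```python
from itertools import product

# X_i: the set of commodity vectors (entitlements) available to person i.
X_i = [
    {"smartphone": 1, "internet_access": 0},
    {"smartphone": 1, "internet_access": 1},
]

def characteristics(x):
    """c(.): convert a commodity vector into the characteristics of those commodities."""
    return {
        "can_go_online": bool(x["smartphone"] and x["internet_access"]),
        "can_store_data_locally": bool(x["smartphone"]),
    }

# F_i: the set of utilization functions available to person i; each one reflects
# personal, social, or environmental conversion factors (e.g., literacy, censorship).
def f_literate_uncensored(ch):
    return {"private_communication"} if ch["can_go_online"] else set()

def f_under_pervasive_surveillance(ch):
    # Same commodities, but an oppressive environment yields no functioning of a private life.
    return set()

F_i = [f_literate_uncensored, f_under_pervasive_surveillance]

# Q_i(X_i): the capability set -- every functioning set b_i achievable by choosing some
# commodity vector x_i in X_i together with some utilization function f_i in F_i.
Q_i = {frozenset(f(characteristics(x))) for x, f in product(X_i, F_i)}

print(Q_i)  # e.g. {frozenset(), frozenset({'private_communication'})}
```

The toy example illustrates the point made below: possessing a resource (the commodity vectors) does not by itself determine the achievable functionings; the conversion functions, shaped by personal and political circumstances, do.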

Regarding PETs, the significance of the capability approach is the evaluative space it offers to make a reasoned judgment of whether a target social group is capable of using and responding to a particular system. It can inform which ingredients of the system would be appropriate for that particular group. If we view PETs as a utility or service that enables the functioning of a private life, then merely possessing the service will not enable the functioning. More is required to use the service: skill, intelligence and physical ability, as well as a conducive social and political environment. Exercising privacy, like any social good, is highly diverse and dependent on age, gender, education, ability, and the other changing circumstances of active individuals [seninequality]. (For example, transport as a social benefit means different things to different individuals. For someone without legs, a standard bicycle can never be an effective means of transport, and offering a cycle would be inadequate, to say the least [crocker_robeyns_2009].) Moreover, this attracts attention to what information is necessary to make an evaluative judgment of personal, social, and political circumstances for the provision of social goods. This notion is a departure from the resource-(only-)based view, which assumes that having PETs will allow everyone to have a private life [ruoti2019].

The specific relevance of capability-based provisioning of PETs is further reinforced by the observed diversity in the expectations from, and commitments of, the various stakeholders of any defined system [bruce2013, rashid2016]. Welfare economics has devoted considerable effort towards formally understanding the "individual" (i.e., their abilities and aspirations) within their social and political realities, as a means to deliver common goods and services effectively [sen1992].

Significance.  We argue for an evaluation that will inform whether everyone is in a position to effectively benefit from the resources (i.e., PETs), irrespective of their deprivations. The capability approach offers an opportunity for designers and developers of PETs to build on a critical assessment and understanding of individual realities.

4 Towards a plural view of PETs: An agenda for research and innovation

The preceding sections highlighted the need for a sufficient assessment of the real opportunities diverse individuals have to achieve the functioning of a private life. The capability approach explicitly departs from welfare evaluations based on the availability of resources and/or policies and is grounded in ethical individualism. This means an explicit emphasis on assessing individual abilities to achieve the functioning in a manner they have reason to value. The natural advantage of this granularity is that diversity will not be subsumed under broader entities, while the interdependence among social groups is still preserved [robeyns2003]. In this section, we propose a research agenda aimed at making those assessments in a rigorous manner.

Figure 1: The research themes with respect to the broad research agenda

Figure 1 gives an overview of the areas of research that should be considered to embed the capability approach as the foundation of PETs. The areas shown in this figure are essential elements for operationalizing the capability approach. The first area of research focuses on the evolution of the basic capabilities that everyone should have. The second area aims to understand the individuals the PETs intend to protect. To fully achieve both, however, there is a need for novel methods and measures of success. Research should identify new ways of recognizing basic capabilities and the metrics that qualify PETs as successful and fit for purpose. We discuss these areas in detail below.

4.1 Attention to Context in the Evolution of Basic Capabilities

Sen defines a basic capability as the ability to satisfy certain crucially important functionings up to certain minimally adequate levels [seninequality]. An example of a basic capability in the context of social welfare in particular geographies is avoiding premature death. In the context of PETs, this means the freedom to perform some basic things online. We interpret basic capabilities not as a definite list but as one that is highly contextual. The list can differ across populations with similar parameters of health, education and needs in different countries, while for populations with advanced needs the list of basic capabilities can, and most likely will, be different. There are distinct groups, ranging from migrants to those living under oppressive regimes to citizens living in better conditions [guberek2018, gurses_pets_2010]. These political diversities, when juxtaposed with gender, race, education and other factors, can provide contextual granularity to a reasonable extent [schlesinger2017intersectional].

For this exposition we refer to Solove's taxonomy, to give shape to what we propose as basic capabilities. Solove proposes four categories of harmful activities, namely (1) information collection, (2) information processing, (3) information dissemination, and (4) invasion [solove_privacy]. In light of this taxonomy and the harms discussed within each of these four activities, a list of basic capabilities can select the related harms that will provoke interventions (PETs) to satisfy crucially important functionings up to minimally adequate levels. An example of a basic capability is the ability to access state welfare mechanisms without being subjected to unauthorized disclosure. The harm of surveillance might apply to such individuals, for example, when inferences (accurate or not) are made about them to inform eligibility for welfare [forte2017]. Furthermore, for differently abled citizens the functioning will be further granulated based on their interface with technology. The list of harms can be complemented with assessment frameworks to elicit the threats latent in the interactions particular citizen groups have with systems [linden2020].
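As one illustration of how such a contextual list could be represented, the hypothetical sketch below encodes a few basic capabilities against Solove's four categories. The specific capability entries, contexts, and selection function are our own assumptions, not drawn from the paper or from any existing framework.

```python
from dataclasses import dataclass, field

@dataclass
class BasicCapability:
    functioning: str           # the functioning to secure up to a minimally adequate level
    harm_category: str         # one of Solove's four categories of harmful activities
    contexts: set = field(default_factory=set)  # citizen groups/situations where it applies

basic_capabilities = [
    BasicCapability(
        functioning="access state welfare without unauthorized disclosure",
        harm_category="information dissemination",
        contexts={"welfare_claimant"},
    ),
    BasicCapability(
        functioning="use public services without behavioural profiling",
        harm_category="information processing",
        contexts={"welfare_claimant", "elderly"},
    ),
]

def capabilities_for(context):
    """Select the basic capabilities relevant to a given citizen group."""
    return [c for c in basic_capabilities if context in c.contexts]

print([c.functioning for c in capabilities_for("welfare_claimant")])
```

The point of the sketch is only that the list is indexed by context: the same harm category provokes different basic capabilities for different groups, which is what a prescriptive, uniform list would miss.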

We pose the evolution of a situated list of basic capabilities as an open question for future research. Prescribing a list would assume an antecedent uniformity. Furthermore, the process by which a list evolves is critical from the perspective of the capability approach. Nonetheless, it is pertinent to draw attention to the debates among scholars working in social justice and welfare over a definite list versus an evolving, context-dependent list [martha1988-NUSNFA-2, nussbaum_2000].

Agenda.  Future research can deliberate whether existing propositions like Solove’s taxonomy or the LINDDUN framework [linddun2018] are adequate or if a nuanced contextual list can evolve through broader participation. A starting point would be to explore the extent to which the existing recommendations are in synergy with the political and economic maturity of various geographies as well as their cultural and social histories [sencapability]. The calibration efforts can then follow.

4.2 Going beyond idealized users

However prevalent in the spheres of technology development, the term 'user' is problematic as it may unhelpfully gloss over diverse social realities and socio-technical relations. Crucial to the plural view of PETs is the need for more nuanced categories and vocabularies that are sensitive to vulnerable groups, or to any actor implicated in systems of data collection who, by their situation, may not fit neatly under the conventional rubric of users. Being inclusive of the diverse observed abilities, needs and circumstances requires accepting them as legitimate focal variables, as opposed to naive assumptions of universality; for example, individuals with different degrees of education should not be deceived by complex legal agreements. The plurality of focal variables means multiple conceptualizations of PETs, in terms of distribution, participation, abilities, and changing circumstances, at both the individual and collective level.

There is a need to expand the scope of action of PETs to attend to different socio-technical arrangements and human relations with technology. Developers ought to ask who will benefit from the enhanced privacy protections of a particular design, and who will not. Such assessments concern ethical questions of inclusivity, dignity, and justice [jacobchicago], which call for a focus on the disadvantaged, and often invisible, actors and their particular realities. Recognizing the heterogeneity of socio-technical relations will enable different groups to enjoy a social good in the manner they can, neutralizing to a reasonable extent the limitations (if any) of their opportunities to do so. To that end, the development of PETs could profit from approaches in sociology such as intersectionality [schlesinger2017intersectional] and the well-established practice of reflexivity within participatory design, whereby personal biases, assumptions, motivations and institutional commitments can at least be made visible, if not recognized as potential design shortcomings [pihkala_reflexive]. We highlight a few productive efforts in this direction which have started to look at ways to attend to the specific security and privacy needs of high-risk or vulnerable individuals such as whistleblowers, protesters, and refugees [ermoshina_can_2017, simko_computer_2018]. Caring for the privacy of both users and non-users will help recontextualize the role of PETs beyond the interface, for example in response to surveillance technologies or the various layers of technical systems involving data collection, transport, and processing. The capability approach seeks to enable plural conceptualizations of citizens by assessing the diverse abilities of individuals to operate systems, to process risks based on their knowledge and sensory abilities, and to interact (or not) with technology, leading to wider exercising of the functioning of a private life.

Agenda.  Research should focus on a nuanced and systematic understanding of the diverse realities of the individuals PETs intend to protect and of the challenges developers face in acquiring and acting upon such understandings. More apt terminologies, beyond monolithic categories such as 'users', are needed to refer to the intended beneficiaries of PETs in all their diversity. We also recommend that developers engage in self-audit and reflective practices aimed at making their assumptions explicit.

4.3 Recognizing human agency

A key consideration of the capability approach is that individuals should be able to achieve the functioning of a private life in a manner they have reason to value. This puts agency at the heart of functioning. Individuals reveal information to remote entities and trust them to prevent identification, exposure, and other misuses of the information [sokhateabuse]. Fears of misuse of sensitive information can lead to reticence or lying [ramokopane2021], and individuals do act driven by morality, compassion, and less self-centred views of rationality. We argue for allowing self-selection by individuals as a possible alternative to a supply-side decision of what is good for them. For example, individuals may choose to share information for medical research provided the uses are explicitly beneficial and governed by morally appropriate authorities [nuffield]. In other applications, an individual may willingly subscribe to receive advertisements for certain products without that being an indication that the individual does not value their privacy. A possible self-selection approach, and a potential remedy against individuals having to lie (e.g., giving false email addresses), could be to allow them to have an anonymous account that cannot be traced back to them. (For this exposition, we refer to recent initiatives by DuckDuckGo and Apple that allow users to hide their email addresses by redirecting emails based on preferences, so that only the messages users want are delivered to them: https://www.theverge.com/2021/7/20/22576352/duckduckgo-email-protection-privacy-trackers-apple-alternative. The caution is that this type of solution still requires users to trust an intermediary, i.e., Apple or DuckDuckGo.)
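As a rough illustration of the kind of self-selection mechanism alluded to above, the sketch below shows a per-service email aliasing relay in which the real address is never disclosed and any alias can be revoked. This is our conceptual sketch, not DuckDuckGo's or Apple's actual design; the relay domain, storage, and function names are hypothetical, and the trusted intermediary caveat still applies (the relay itself sees the mapping).

```python
import secrets

ALIAS_DOMAIN = "relay.example"     # hypothetical forwarding domain
alias_table = {}                   # alias -> real address, held only by the relay

def new_alias(real_address, service):
    """Mint a random alias for one service; the service never learns the real address."""
    alias = f"{service}-{secrets.token_hex(8)}@{ALIAS_DOMAIN}"
    alias_table[alias] = real_address
    return alias

def forward(alias, message):
    """The relay looks up the alias and forwards mail; unknown or revoked aliases are dropped."""
    real = alias_table.get(alias)
    return f"deliver to {real}: {message}" if real else None

def revoke(alias):
    """Self-selection in action: the individual can cut a service off at any time."""
    alias_table.pop(alias, None)

shop_alias = new_alias("citizen@example.org", "shop")
print(forward(shop_alias, "Your receipt"))    # delivered
revoke(shop_alias)
print(forward(shop_alias, "Unwanted offer"))  # None: dropped after revocation
```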

While Article 6(1)(a) of the GDPR (https://gdpr-info.eu/art-6-gdpr/) requires that the data subject has given consent to the processing of his or her personal data for one or more specific purposes, instances of violations (e.g., https://ico.org.uk/media/action-weve-taken/enforcement-notices/2620027/emailmovers-limited-en.pdf) bring to the fore the dangers of sharing more information than is required. We are not asserting that making such decisions is within the cognitive load and capacity of all individuals [solove2021, colnagocognitive2020]; we are recommending a more nuanced understanding and representation of contexts and of proportionality within them. Merchants who violate regulations keep collecting more data than is required, hiding behind complex consent controls or the opacity that separates end-users from merchants. Such understanding can potentially influence the implementation of the law in both letter and spirit and eliminate excessive data collection right at the point where individuals actively or passively interface with online systems. Van Der Linden et al. explore software developers' attitudes towards the collection of data from their users. They find that developers' attitudes are not guided by the established principles of being 'adequate, relevant' and 'limited' to the purpose for which the data is collected [linden2019]. The authors arrive at this conclusion by evaluating against specific regulations, which may or may not be in sync with what citizens would prefer.

Agenda.  We recommend further research to empirically understand individual choices, giving an evaluative understanding of the interactions citizens have reason to value. Our recommendation is for a rigorous understanding of people's choices, intentions, values, and motivations, irrespective of what developers or regulators think is good for citizens. This evaluation can feed into negotiating the proportionality of information disclosure particular to contexts and inform regulations and systems.

4.4 An assessment of power dynamics

We argue that a disciplined assessment of the deprivations and valued interactions of active individuals should form the basis of PETs. The exercise is not self-contained but depends largely on what politically powerful forces are willing to concede. Experience with provisioning public goods shows their disproportionate use and availability among the population; the ability to appropriate them operates at many layers.

In September 2019, the Court of Justice of the European Union (CJEU) ruled in two cases, C-136/17 and C-507/17 [globocnik2020]. In the former, the court made an implicit acknowledgement of the right to be forgotten; in the latter, the same court limited the territorial scope of that right. Since the CJEU nudged lawmakers to consider expanding the territorial limits of the GDPR, the way forward is driving public opinion for lawmakers to take it up with their counterparts in other jurisdictions. Google is a profit-making enterprise that makes use of, and profits from, the information it stores about individuals. A pertinent question thus concerns the prudence of entrusting Google to decide which information is in the public interest and which is not. The rise of the data economy has put corporations in a problematic position when it comes to accommodating the public interest if doing so threatens to impact profits [zuboff_age_2019].

The other issue concerns the ability of various groups to use a public service when it is available. Several factors engender the widespread uptake of such services, but a significant contributor is citizens' awareness of their rights and of recourse to violations. The experience of vulnerable sections of society with access to justice in general is not encouraging [gill2021], and when there is access, the battle is far too long and draining (see, e.g., https://www.postofficetrial.com). Information asymmetry does not exist by itself but sometimes by bureaucratic design [hood1995]. A strength of the capability approach is that, along with its explicit consideration of human diversities, it actively factors in political realities as a critical conversion factor for individuals to lead the life they value.

Agenda.  The political economy of privacy protection foregrounds not just the heterogeneity of individual abilities and needs, but questions around which regulatory interventions and political supports are needed to further the technical goals of PETs. This calls for nuanced observation of any deliberate interferences that can cloud claims of freedom made by the powers of the day; in clear terms, empirical observation of adherence to stated laws both by those entrusted with implementing the law and by those who are supposed to enjoy its protection. The realized measure of liberty can feed into the provisioning of PETs for various groups.

4.5 Diversity of methods and measures of success

A shift from a supply-side view of what citizens need to plural conceptualizations of individuals will bring cogency and make their participation in online activities enjoyable and valuable. The methods adopted to realize the research agenda are crucial to the success of embedding the capability approach as a foundation of PETs.

How to prepare the list of basic capabilities?

The process by which the list of basic capabilities and interpersonal comparisons evolves is crucial to the capability approach. Such a list is significant for policy evaluations and measurements related to privacy (or the lack thereof), and its legitimacy is critical if PETs are to serve as a means of social justice and democracy. Sen explicitly recommends debate and democratic participation. Selection will be an inescapable part of this process, which would mean catering to the needs of particularly vulnerable groups in terms of ability and/or education and environment. Contemporary political philosophers have engaged with the issues concerning selection in other contexts; we refer to the work of Robeyns for exposition [robeyns2003], though we are not rigid about a particular set of criteria. The criteria of explicit formulation and methodological justification require that the selected capabilities be defensible on both counts. The list must also be sensitive to the context of the target group. The criterion of generality specifies that the list should evolve in two stages: first a general list, and second a fine-grained list enumerating all the capabilities a citizen should have, refined to local conditions on the basis of data and empirical research. Finally, selected capabilities should have only negligible overlap with one another to satisfy the criterion of non-reducibility [robeyns2003].

Methods to include individuals with diverse abilities and situations

Though human-centred design (HCD) has dominated conversations as an approach to promote increased adoption and use of systems, it has often been reduced to user studies and consultation [buchanan2001]. Moreover, there is a limit to what developers can learn from their users, given several social, material, and political constraints [stewart_wrong_2005]. This owes, among other factors, to varying degrees of ignorance and technical literacy, issues of accessibility (cognition, location, vulnerability, language, information overload), and the presence of vast information asymmetries between users and highly opaque information systems. Without underestimating the importance of HCD, the capability approach entails much more than the notion of utility implicit in usability [oosterlaken2009]. PETs have a moral obligation to cater to those whose "body and mind" do not fit the conventional construction of a user. Observed diversities and realities are as crucial as those that are unobservable. We borrow the term unknown known from [rashid2016] to emphasize the emergent and continuously evolving nature of individuals and the environment. Focus groups, interviews, and other participatory research methods have proved highly productive, yet they need to be complemented with fresh approaches to understand implicated actors and evolving environmental realities. For example, Albrecht et al. conducted ethnographic research with 11 protesters from Hong Kong to understand the improvisations and unusual tactics protesters resorted to in order to avoid state surveillance [albrecht2021]. Schlesinger et al. introduce the sociological framework of 'intersectionality' into HCI to understand the complex and sometimes fluid identities of individuals. While they acknowledge the progress made in unpacking questions of identity, they also point to gaps in relation to multiple forms of exclusion and oppression based on gender, race or class [schlesinger2017intersectional]. Such conceptual and methodological frameworks can feed into normative evaluations of the conversion factors individuals in similar situations need to achieve the functioning of a private life.

How to measure the success of adopting the capability approach?

While we depart from comparisons of welfare based on the possession of resources, the critical question is how to evaluate PETs built using the capability approach. Conventionally, technologies have mainly been measured in terms of their adoption or acceptability, which cannot always account for unexpected uses and reactions, or for the effectiveness of the technology in living up to its promises. Future research can delve into metrics and assessments that are not merely techno-centric but more adequately reflect how citizens are able to exercise their functionings and enjoy their right to privacy. Functionings can be observed not only quantitatively but also qualitatively, for example whether a journalist living under an oppressive regime can exercise her right to a private life without oppression. Measures of success should factor in the diversity of beneficiaries, such as race, age, ethnicity, gender, sexuality, social and political realities, physical disability, mental health, pregnancy, or caring responsibilities. While factoring in diversity, adequate care should be taken to limit discrimination among users and the exclusion of non-users.

5 Discussion

In 2019, when deciding whether a person had the capacity to decide on internet and social media use, a judge in the United Kingdom observed (https://www.bailii.org/ew/cases/EWCOP/2019/3.html): 'I do not envisage that the precise details or mechanisms of the privacy settings need to be understood, but P should be capable of understanding that they exist and be able to decide (with support) whether to apply them.' This statement is instructive in how an evaluation of the mental and physical ability of the applicant is critical to understanding whether an individual (i.e., P) would be able to use the privacy mechanisms provided. A closer look at their personal and social circumstances would give valuable insights for making such an evaluation. In this case, the applicant was neither physically nor mentally capable of using social media privacy settings correctly, though the functioning of safe and private browsing is vital to the individual. The central question from the perspective of justice and equal opportunities that arises in this context is: is it possible for an individual with special needs to understand and achieve the functioning of safe browsing?

There are other instances where users with different levels of expertise failed to appropriate their functioning of a private and safe life online because the system did not take their situations into account. For example, while commenting on the increase of fraud against elderly citizens, the Aspen Tech Policy Hub stated that 'the elderly lack access to training on newer technologies, suffer anxiety over breaking new equipment, and a potential decline in sensory abilities such as eyesight' (https://www.aspentechpolicyhub.org/project/protecting-older-users-online/). Moving on from individuals to the environment, cases of abuse of human rights and freedom of the press in various parts of the world are well documented (https://commonslibrary.parliament.uk/research-briefings/cdp-2020-0063/). Citizens in some parts of the world, even when equipped with the resources (e.g., devices and the Internet), are not able to exercise their functioning of private life and freedom of speech. They live in an environment where they are profiled and watched without their knowledge or consent, even when they do not directly interact with technology.

Using the capability approach to evaluate the individuals implicated in the previous examples would have highlighted issues vital to the design and development of PETs. For the less abled individual, a capability-based evaluation would have highlighted the need for special assistance while using PETs; systems could then have been developed that consider individuals who need such assistance while respecting their privacy. In the context of the elderly victims of fraud, the capability approach would have underlined that they find it challenging to learn new technologies, so equipping them with technology alone will not enable them to achieve the functioning of a private life. For residents of regions subjected to human rights abuses, a capability-based evaluation would have pointed precisely to the fact that, even with the ability and wherewithal to use technology, they will not be able to achieve the functioning of a private life; in their circumstances, political resolution may be what enables it.

On a more general note, these examples further reinforce the significance of the research agenda we set out earlier. A context-sensitive list of basic capabilities would have yielded the basic functionings for the specially abled individual, the elderly, and individuals living under an oppressive regime. The development of PETs to serve that list would then need to go beyond the conventional notion of users in each of the cases cited here, giving explicit consideration to the opportunities people have depending on their social and political realities.

6 Conclusion

In our view of using the capability approach as a foundation for building PETs, we recognize the moral obligation of PETs towards human agency and diversity. This view attends to the need to cater for individuals in all their complexity and advocates for accountability and transparency in the process of development. Much in line with Lucy Suchman's critique of 'designing from nowhere' [suchman_located_2002], we advance that privacy-enhancing technologies should be sensitive to the plural realities of users and non-users of technology and their diverse needs, locales, preferences, abilities, and social and environmental conditions. This is achieved by foregrounding developers' commitments and preconceptions and including all relevant constituencies in the deliberation and design of ways to protect privacy. It is important to acknowledge that, as with any human-centred approach, the ability to garner privacy requirements is encumbered by limits in the knowledge possessed by users and their actual means to inform designers of what is needed [stewart_wrong_2005]. Overlooking these constraints could be detrimental if it creates a false sense of certainty. As recent studies on surveillance have shown, privacy violations linked with algorithmic behaviour prediction can occur completely unbeknownst to users (and non-users) even when good legal privacy provisions and technical measures are in place [miller_total_2014]. Increasing attention has been given to the ethical issues associated with the datafication of human activity and its use for statistical inference and prediction of future behaviour [muhlhoff_predictive_2020]. This underpins our emphasis on practicing reflexivity in development and on surfacing power dynamics and systemic issues. Opacity not only makes locating responsibilities difficult but unavoidably creates a situation of unevenly distributed costs and benefits. Our proposal is a call for a holistic view of citizens in their environment, aiming to expand the repertoire of empirical methods that inform technical and policy interventions.

References