1 Introduction
The General Data Protection Regulation (GDPR) is a new European Regulation that will apply from 2018.
The current EDPD and the future GDPR do not stand on their own, but are part of a legal system of privacy protection that is applicable to all Member States of the European Union. This means there are more rules applicable to privacy protection than the EDPD or GDPR. Privacy is actually one of the oldest human rights in Europe, and is protected by many human rights treaties.
The key principles of data privacy and protection are the same for the EDPD and the GDPR.
Since the creation of the EDPD in 1995, many new technological developments have taken place. Think for example of the creation of Google (1998), Facebook (2004), and Wi-Fi (1999).
First, the term ‘Quantified Self’ (QS) is explored further. Then, in section three, the meaning of freedom, autonomy and privacy are discussed. Section four discusses the elements of the GDPR that are of specific importance for QS. Section five discusses how QS affects freedom and whether or not the GDPR can help to protect that freedom. Finally, section six concludes that the GDPR has some positive influences on the parts of QS that affect our freedom. However, this is limited to external freedom. The GDPR does not appear to touch upon internal freedom.
2 Quantified Self
The term ‘Quantified Self’, often used to describe a societal movement, was introduced in 2007 by Kelly and Wolf.
QS refers to situations in which people record data about themselves.
The data in a QS-device can either be collected automatically or be entered manually by the user. Automatic collection is done, for example, by a step tracker that counts and collects every step you take, whereas when you keep track of your diet, you have to enter manually what you eat. QS uses various methods to ‘quantify’ a person’s approach to life, health, mood, locations, personal goals (such as sport) and more.
The idea that persons collect data about themselves is not new. The notion of a societal QS movement can be seen rather as a new set of terminology (QS, self-tracking, etc.) for a habit that has been common for a long time: tracking and collecting information about oneself.
3 Freedom, Autonomy, and Privacy
Every society needs rules that help to protect the individual’s freedom.
Privacy is a difficult concept, without one clear meaning.
Kant therefore clearly links freedom with autonomy. Just as freedom and autonomy are linked, so are freedom and privacy. Westin described how “each individual is continually engaged in a personal adjustment process in which he balances the desire for privacy with the desire for disclosure and communication”.
We now know the scope and meaning of the central notions in this article. All of these elements refer back to the overarching notion of human dignity. The next sections discuss how QS can negatively influence freedom and thereby interfere with an individual’s life. The term ‘freedom’ is used, but it refers to both freedom and autonomy, and is closely linked to privacy. Later in the article, a distinction is made between external and internal freedom; this is explained further in section 5. First, however, it is time to take a more detailed look at the GDPR.
4 The GDPR and QS
The GDPR regulates the “protection of natural persons with regard to the processing of personal data and rules relating to the free movement of personal data”.
The GDPR applies to the processing of personal data, that is, any information relating to an identified or identifiable natural person.
Relevant for QS is article 3 of the GDPR, which states that it does not matter whether the actual processing takes place in the Union, as long as it occurs in “the context of the activities of an establishment of a controller or a processor in the Union”.
Another important element of the GDPR is that the processing of personal data is only allowed if a number of conditions are met.
Two elements of the GDPR may cause problems for QS-tools. Firstly, there are two problems related to the consent that the GDPR requires. The first concerns the prohibition on processing “data concerning health”.
Because of the specific nature of (most) QS-tools, which do collect these types of health-related data, this can be problematic, as in the example of Strava, where data about users’ pulse and physiological state are shared. Although there is a basic prohibition on the processing of data related to health, there are some exceptions; these are similar in the GDPR and the EDPD.
Secondly, another problem with QS and the GDPR relates to the fact that data collected in a QS-application cannot be used for other purposes. The GDPR requires that personal data be collected for specified, explicit, and legitimate purposes. Moreover, the data collected must not be processed in a way that is incompatible with those purposes.
5 QS, Freedom, and the GDPR
This section first shows how QS influences freedom. QS does not usually stop after personal data have been collected: these data are processed in order to draw conclusions from them. The different steps that are normally used to process data collected with QS are described. These steps are: comparison, creation of group norms, standard-setting (internal restriction of freedom), and judgments on the basis of data (external restriction of freedom). Secondly, for every step, it is shown how the GDPR can or cannot help to protect an individual’s freedom in that specific regard. The GDPR aims to protect the personal data and privacy of individuals, but is this enough to protect their freedom? Of course, not all data that are collected with QS are processed in exactly the same way, but the statements made in this section are applicable to most data.
5.1 Comparison in Three Ways
There are several ways in which the processed data can be compared. First, a person’s data can be compared to the data of other users. This can happen on an online forum, through a community, or in the app itself.
Since most QS-users track themselves to improve some aspect of their life (eating healthier, becoming fitter, running faster), comparing their own data with that of others is an obvious way to measure progress.
5.1.1 Comparison and the GDPR
The GDPR has a few rules on these categories. It obliges processors to pseudonymise personal data in order for them to be processed lawfully.
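For readers unfamiliar with the technique, pseudonymisation can be illustrated with a minimal sketch. The code below is hypothetical (the data, key, and function names are invented, not any QS vendor’s implementation): direct identifiers are replaced by a keyed hash, so records remain linkable for comparison without directly revealing who the user is.

```python
import hmac
import hashlib

# Secret key held separately by the controller: this is the
# "additional information" needed to link pseudonyms back to people.
SECRET_KEY = b"kept-out-of-the-analysis-dataset"

def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a stable keyed hash."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

records = [
    {"user": "alice@example.com", "steps": 9500},
    {"user": "bob@example.com", "steps": 4200},
]

# The analysis dataset keeps the metric but not the identity; because the
# same user always maps to the same pseudonym, data can still be compared
# across sessions and between users.
pseudonymised = [
    {"user": pseudonymise(r["user"]), "steps": r["steps"]} for r in records
]
print(pseudonymised)
```

Note that, as the sketch shows, pseudonymised data remain linkable, which is why the GDPR still treats them as personal data.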
Regarding the creation of the categories, it is clear that the QS-data of a specific individual will be used for comparison with the data of other users. This not only entails that every individual receives information about other users, but also that all data are compared to those of other users. It is questionable whether users’ specific consent is asked for this, and therefore it is unclear whether processing users’ data for this purpose is in line with the GDPR. For example, the privacy statement of MoodPanda makes clear that information is used to “measure and improve” its services over time.
5.2 Group Norms
The yardstick that is used to compare data to a group is an interesting point to discuss further. Social norms are present in every society or group.
Most of these users aim at improving a certain condition. They will therefore produce data that differ from the average you would find in an entire society, or at least in a randomised group of people: people who want to lose weight typically eat less than other people, and people who want to run a marathon train harder than average. So people do not only share their data, but also (implicitly or explicitly) their values and goals.
In a group of QS users, different things can happen when data are shared. Here we first have to make a distinction between QS-tools that make it possible to decide whether or not to share a certain achievement and apps that do not give that option.
Secondly, it may be that people have to share all their data. This leads to a more balanced picture, since it shows not only people’s top scores but also their off-days. This might make it easier to start using an app, since everyone has been a beginner, and those data are available as well.
So there are roughly two main options: people either become more motivated, or they feel disappointed because they cannot live up to the group standard. These group norms will therefore at least be an influential factor in one’s decisions. From now on, this article focuses on persons who try to follow the group norms.
5.2.1 Group Norms and the GDPR
It is difficult to link the creation of group norms as described above to the GDPR. In my opinion, this is ultimately related to profiling or self-profiling. Profiling is “the process of discovering correlations between data in databases that can be used to […] identify a subject as a member of a group or category”.
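The profiling step in this definition can be illustrated with a toy sketch (the data, group labels, and thresholds below are invented, purely for illustration): a subject is assigned to the category whose group averages lie closest to his or her own data, and that category then supplies the norm against which the subject is judged.

```python
# Invented group averages, as a QS-tool might derive them from its users.
group_averages = {
    "casual runner": {"weekly_km": 10.0, "avg_pace_min_per_km": 6.5},
    "marathon trainee": {"weekly_km": 60.0, "avg_pace_min_per_km": 5.0},
}

def profile(subject: dict) -> str:
    """Assign the subject to the category whose averages are closest."""
    def distance(avgs: dict) -> float:
        # Squared distance between the subject's data and the group profile.
        return sum((subject[k] - v) ** 2 for k, v in avgs.items())
    return min(group_averages, key=lambda g: distance(group_averages[g]))

new_user = {"weekly_km": 55.0, "avg_pace_min_per_km": 5.2}
# The assigned category becomes the group norm the user is compared against.
print(profile(new_user))
```

Even this toy version shows the point made above: the subject never sees the underlying logic, only the group he or she has been placed in.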
5.3 Internal Restriction of Freedom
5.3.1 Following the Standard
As discussed above, group norms can be created by a QS-app. But how can these norms influence people’s behaviour? Apart from the research mentioned below, not much practical evidence exists on restrictions of internal freedom. However, this does not mean that it cannot be an issue. There are strong arguments, put forward in this section, to suggest that internal freedom can be restricted because of QS. Research has shown that people look to others to guide their actions.
There are different reasons why these standards influence people so heavily. Firstly, people can feel uncertain about what to do, and therefore look to others to determine what the right thing to do is. Secondly, these norms can tell people how to fit in with the majority in order to be accepted.
Due to uncertainty, people are likely to follow a new standard that has been set either by the users of the QS-tool or by the QS-tool itself. This is because, as explained earlier, the QS-tool uses data analysis to compare one’s data on three different levels: previous scores, other users, and a person’s goal. So the norm is set not only by what other users do or should do, but also by what you do or should do, based on the feedback you receive from the QS-analysis.
This new standard that an individual feels he or she should follow changes his or her frame of reference. This can occur without the person really being aware of it. Because only specific persons use an app, the standard given in such an app is not ‘the average’. This can lead to tunnel vision, in which a user may think that his or her scores are not good enough, although they are in fact much higher than those of most other people. The standard that an individual imposes on him- or herself can thus be unrealistic and far too high. By following this standard, an individual can rigorously limit his or her own freedom and autonomy. Lupton has, for example, described the challenges related to healthy eating.
Although this can obviously restrict a person’s freedom, it is still a choice to submit to the discipline of QS. In this sense, the QS-app may limit your internal freedom to behave as you want, but you are still free to choose not to use it. In addition, the limitations of internal freedom are technological features that can be opted out of and that have always existed in other forms. An example is the social pressure an individual can feel to conform to beauty standards, a pressure that existed long before the technologies behind QS were created. QS is comparable with such beauty standards, but it forms an even more powerful standard that can restrict an individual’s internal freedom. This is not only because QS provides an opportunity to collect more (if not all) data about a certain issue or individual, which makes it much more personal and potentially more infringing. There is also a great difference between a beauty standard that is the same for everyone and a personalised standard that counts everything an individual does. Moreover, because of technological progress, QS cannot be ‘fooled’; this is related to the difference between a human coach and a phone with QS, which is mentioned in section 5.3.2, and also in section 2. Finally, QS is less visible, more hidden, and even more merciless than other social structures that can limit an individual’s internal freedom.
5.3.2 Internal Freedom and the GDPR
This form of restriction is already known in the literature (see e.g. the literature about behavioural targeting and filter bubbles).
It can be questioned whether the purpose of achieving one’s own goal includes comparison with other users’ achievements (and re-use of your data to determine the standards for others). Accordingly, it is questionable whether the consent given to the QS-app can fulfil the demands of the GDPR. How could you ever consent to having your internal freedom restricted, if you cannot know the underlying logic that the app uses to create the standards that you set for yourself? This is all the more worrying if the risks for physical and emotional well-being are taken into account; not being able to meet your own standard is not particularly helpful for your self-confidence.
This is something to be aware of. It might be necessary to open a debate on whether the GDPR should protect this type of internal freedom, in order to protect individuals from harming their physical and/or mental health by trying to achieve unrealistic goals. An online QS-tool differs greatly from a human coach here: a coach can bring a human factor to the evaluation by showing empathy and understanding.
5.4 External Restriction of Freedom
5.4.1 Judged on Your Data
One step further than restricting an individual’s freedom to choose is the situation in which other people judge him or her on the basis of a QS-profile. This means that others restrict a person’s freedom (external restriction of freedom), because they limit his or her options based on the data they have. Very different situations are conceivable here. One is a situation in which friends or members of a group judge an individual on the basis of his or her achievements in a QS-tool. This can be very harmful, especially when it influences a person’s social status.
In addition, it may be possible that companies or other private parties judge persons on the basis of their data. One example is insurance companies that offer a discount for healthy living when you share your data. Another example is a typical American phenomenon called workplace wellness programmes. These programmes seek to “help employees improve their health and fitness levels, often by offering incentives to employees who participate in various program activities or achieve certain health-related goals”.
The fact that this does not happen in Europe does not mean that companies there cannot judge individuals on their data. Some of the data collected via QS can be extremely interesting for marketers and companies.
Finally, not only companies but also governments might be very interested in these data. Governments may not run QS themselves, but they may deploy QS technologies for their own purposes or access data collected with QS. Although this is probably unimaginable in Europe, in China the Communist Party is working on a so-called ‘social credit system’.
All of these examples show that many parties have an interest in personal data, especially the more sensitive or private data that QS-tools specifically target. The data collected by individual users are analysed, which allows interested parties to find patterns or correlations between different datasets, or to deduce new information from the data.
Moreover, it can be the case that persons get certain (dis)advantages on the basis of their data. Think about a discount for health insurance when living healthy, or a higher price when a change in diet proves unsuccessful. Imagine your health insurance provider sending you the following messages: “You have exceeded your fats quota this week; you don’t adhere to your dietary goals: your insurance premium will rise”.
Even targeted advertising can affect the choices that we make. Therefore, when a company sends targeted advertisements, our free choice is affected by the fact that a company has assessed us on the basis of our data.
These risks described above are not hypothetical at all. Different authors have warned about the privacy challenges of self-trackers.
5.4.2 External Restriction of Freedom and the GDPR
Many of the problems related to limitations of external freedom have already been mentioned in the literature. All of the problems mentioned here are related to QS, although not all of them have previously been linked to it.
The first problem is that data in QS-tools are very poorly protected, which makes it quite possible for others to access your data. A reason for this is that QS-apps work with low-cost data collection and communication systems, so security measures are also kept minimal and cheap.
Even with companies and third parties that have (lawful) access to the data gathered via QS, there can be situations in which data are processed in a way users did not directly consent to. A revealing example is a situation that occurred in 2012, when the New York Times showed that a retail chain had used data mining and processing techniques to predict which female customers were pregnant, even when they had not yet announced publicly that they were pregnant (in fact, some of them were not even yet aware of it).
A third problem is that many QS-apps collect information about health. Although data directly related to health are protected under a stricter regime, data about behaviour in general, such as showing interest in certain information or one’s ability to run, can also reveal a lot about one’s health.
Fourthly, there is the risk that companies, based on an existing set of data, try to predict an individual’s future behaviour. This can happen via a predictive model that predicts possible future data, and thus reacts to not-yet-existing data. This is an external limitation of freedom in its most extreme form. Crawford and Schultz state, “these privacy problems go beyond just increasing the amount and scope of potentially private information. Based on existing publicly available information, Big Data’s processes can generate a predictive model of what has a high probability of being [personally identifiable information], essentially imagining an individual’s data”.
Other problems related to external freedom are not solely applicable to QS. These problems are not fundamentally different when looking at QS, but apply to all data and privacy challenges. First of all, it is questionable whether it is even possible to give consent for the processing of your data when it is not clear for exactly what purpose this consent is given, as the GDPR requires.
Although processors are obliged to anonymise the data they sell (e.g. through pseudonymisation, encryption or key-coding), such measures are not watertight: supposedly anonymised data can often be re-identified by combining them with other datasets.
6 Conclusion
In conclusion, it is not difficult to see that QS can influence one’s freedom and autonomy. Some aspects of the GDPR are positive in this light: the GDPR also applies to apps offered from outside Europe, it contains an obligation to ask for consent before processing data, and specific consent is required for the processing of health-related data. Finally, personal data can only be collected for specific purposes. However, many of the problems regarding external freedom mentioned in this article are not addressed by the GDPR. Much less familiar, but at least as problematic, is the restriction of internal freedom. The GDPR does not touch upon any of these aspects, which makes it unable to protect internal freedom as influenced by QS. This article adds to the debate on internal freedom, on which little information is available beyond the potential uses of the technologies, in contrast to external freedom, which has been explored in greater depth in a plethora of works. More research on the limitations of internal freedom related to new technological developments, such as those that underpin QS, is therefore required.
* The author would like to thank her supervisor Dr. M. van der Linden-Smith for all the valuable feedback, support and interesting conversations. Additionally, the author would like to thank the anonymous reviewers for their feedback.
[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L 119/1 (hereinafter ‘GDPR’).
[2] Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L 281 (hereinafter ‘EDPD’).
[3] See “GDPR Portal: Site Overview” available at http://www.eugdpr.org/ (accessed 5 December 2017).
[4] This will be on 25 May 2018.
[5] GDPR, rec. 7.
[6] Ibid., rec. 11.
[7] Ibid., rec. 10.
[8] Francesca Bignami, “Privacy and Law Enforcement in the European Union: The Data Retention Directive” (2007) 8(1) Chicago Journal of International Law 233 – 255, p. 233.
[9] According to the EU’s Treaty of Lisbon, the EU is required to accede the ECHR (Consolidated Version of the Treaty on the Functioning of the European Union (TFEU) [2010] OJ C 83/47 art. 6). However, on 18 December 2014, the Court of Justice issued a negative opinion on the accession of the EU to the ECHR (Opinion 2/13 [2014]).
[10] Supra n. 8, pp. 241-242.
[11] TFEU supra n. 9, art. 16.
[12] International Covenant on Civil and Political Rights Adopted and opened for signature, ratification and accession by General Assembly resolution 2200A (XXI) of 16 December 1966 entry into force 23 March 1976, in accordance with Article 49.
[13] Francesca Bignami, “Transgovernmental Networks vs. Democracy: The Case of the European Information Privacy Network” (2005) 26(565) The Michigan Journal of International Law 807-870, pp. 813-819.
[14] EDPD, rec. 10.
[15] Gerrit Hornung, “A General Data Protection Regulation for Europe? Light and Shade in the Commission’s Draft of 25 January 2012” (2012) 9(1) SCRIPTed 64-81.
[16] Interinstitutional File 2012/0011 (COD), Presidency to the Council, 11 June 2015, 9565/15, Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) – Preparation of a general approach.
[17] See “GDPR Key Changes” available at http://www.eugdpr.org/key-changes.html (accessed 5 December 2017).
[18] Google was founded on 4 September 1998 by Larry Page and Sergey Brin.
[19] Facebook was launched on 4 February 2004 by Mark Zuckerberg and Eduardo Saverin.
[20] In 1999, six companies together created the Wireless Ethernet Compatibility Alliance, which branded the technology ‘Wi-Fi’; thus Wi-Fi was created in 1999. For more information, see: The Economist, “A brief history of Wi-Fi” (2004) The Economist, Technology Quarterly Q2, available at http://www.economist.com/node/2724397 (accessed 5 December 2017).
[21] Dominik Leibenger, Frederik Möllers, Anna Petrlic, Ronald Petrlic and Christoph Sorge, “Privacy Challenges in the Quantified Self Movement – An EU Perspective” (2016) 4 Proceedings on Privacy Enhancing Technologies 315-334.
[22] Ibid.
[23] Gary Wolf, “What is the Quantified Self?” [2011] available at http://quantifiedself.com/2011/03/what-is-the-quantified-self/ (accessed 5 December 2017). The Quantified Self Company provides users a community and it organises amongst other things meetings, conferences, forums, web content, and a guide.
[24] See “Quantified Self: self knowledge through numbers” available at http://quantifiedself.com (accessed 5 December 2017).
[25] Deborah Lupton, “Understanding the Human Machine” (2013) 32(4) IEEE Technology and Society Magazine 25-30, p. 25.
[26] Mario Ballano Barcena, Candid Wueest, and Hon Lau, “How Safe is Your Quantified Self?” (2014) Technical Report Symantec 1-38.
[27] Margreet Riphagen et al., “Learning Tomorrow: Visualising Student and Staff’s Daily Activities and Reflect on it” (2013) ICERI 2013 conference proceedings, p. 1.
[28] Melanie Swan, “The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery” (2013) 1(2) Big Data 85-99, p. 85.
[29] Minna Ruckenstein and Mika Pantzar, “Beyond the Quantified Self: Thematic Exploration of a Dataistic Paradigm” (2015) 19(3) New Media & Society 1-18, p. 3.
[30] Supra n. 27, p. 2.
[31] Supra n. 28, p. 85.
[32] Supra n. 27, p. 3.
[33] See “track your mood & get anonymous support” available at http://moodpanda.com (accessed 5 December 2017).
[34] See “Everyone. Every run” available at https://runkeeper.com/index (accessed 5 December 2017).
[35] See “Weight loss that fits” available at: https://www.loseit.com/ (accessed 5 December 2017).
[36] Supra n. 28, p. 85; supra n. 29, p. 2.
[37] See “Strava” available at https://www.strava.com/ (accessed 5 December 2017).
[38] See for an overview of all possibilities of Strava: “Features” available at https://www.strava.com/features (accessed 5 December 2017).
[39] Ibid.
[40] See Mike Wehner, “Strava Begins Selling Our Data Points, and No, You Can’t Opt-out” [2014] engadget, available at https://www.engadget.com/2014/05/23/strava-begins-selling-your-data-points-in-the-hopes-of-creating/ (accessed 5 December 2017).
[41] Alan Westin, Privacy and Freedom (London: The Bodley Head, 1967) p. 23.
[42] Edward Eberle, “Human Dignity, Privacy, and Personality in German and American Constitutional Law” (1997) 4 Utah Law Review 963-1056, p. 964.
[43] Ibid, p. 965. This is also very clearly a Kantian idea.
[44] Daniel Solove, Understanding Privacy (Cambridge: Harvard University Press, 2008) p. 1.
[45] Supra n. 41, p. 7.
[46] Auto (αὐτο) means self and nomos (νόμος) means law.
[47] Paul Guyer (ed), The Cambridge Companion to Kant and Modern Philosophy (Cambridge: Cambridge University Press, 2006) p. 345.
[48] Rudolf Steiner, The Philosophy of Freedom (Rudolf Hoernlé tr, Sussex: Rudolf Steiner Press, 1916) p. 40.
[49] Ibid.
[50] Robert Johnson and Adam Cureton, “Kant’s Moral Philosophy” [2016] Stanford Encyclopedia of Philosophy, available at https://plato.stanford.edu/entries/kant-moral/ (accessed 5 December 2017).
[51] Immanuel Kant, Groundwork of the Metaphysics of Morals (Thomas Abbott tr, London: Longmans, Green and co, 1895) at 6:214.
[52] Immanuel Kant, Kritik der Praktischen Vernunft (Riga: Hartknoch, 1788) ch 1.
[53] Supra n. 51.
[54] Ibid., at 4:421.
[55] Ibid., at 4:429.
[56] Ibid., at 4:431.
[57] Supra n. 41, p. 7.
[58] Supra n. 44, p. 2.
[59] GDPR, art. 1(1).
[60] Ibid., art. 4(1).
[61] Ibid.
[62] Ibid., rec. 26.
[63] Ibid., art. 2(1).
[64] Ibid., art. 2(2)(c).
[65] Ibid., art. 3(1).
[66] Supra n. 21, p. 318.
[67] GDPR, art. 6.
[68] EDPD, art. 7.
[69] GDPR, art. 6.
[70] Claudia Quelle, “Not Just User Control in the General Data Protection Regulation. On Controller Responsibility and How to Evaluate Its Suitability to Achieve Fundamental Rights Protection”, in Anja Lehmann, Dianne Whitehouse, Simone Fischer Hübner, Lothar Fritsch and Charles Raab (eds) Privacy and Identity Management. Facing up to Next Steps (IFIP Summer School 2016, Berlin: Heidelberg, 2017) p. 4.
[71] GDPR, art. 4(11).
[72] See: GDPR, Recital 32.
[73] Supra n. 70, p. 4.
[74] GDPR, art. 7(4).
[75] GDPR, Recital 43; supra n. 70, p. 4.
[76] For example, the ePrivacy Directive does not currently require unambiguous consent, because the EDPD does not include ‘unambiguous’ in its definition of consent.
[77] See: GDPR, art. 8.
[78] See: Ibid., art. 9, with exceptions in 9(2)(a) and 9(2)(e).
[79] Ibid., art. 4(15).
[80] Supra n. 21, p. 318.
[81] GDPR, art. 9(2)(a).
[82] Supra n. 21, p. 318.
[83] Ibid., p. 317.
[84] Ibid.
[85] See: GDPR, art. 5(1)(b).
[86] Often, this will be an individualised comparison: for example, only the running speed of other female users between 20 and 25 will be compared with your data.
[87] Supra n. 26, p 10.
[88] Katleen Gabriels, “I Keep a Close Watch on this Child of Mine” (2016) 18(3) Ethics and Information Technology 175-184, p. 175.
[89] Edwin Locke and Gary Latham, “The Application of Goal Setting to Sports” (1985) 7 Journal of Sport Psychology 205-222, p. 206.
[90] Supra n. 26.
[91] GDPR, art. 6.
[92] See for example Harald Gjermundrød, Ioanna Dionysiou, and Kyriakos Costa, “PrivacyTracker: A Privacy-by-Design GDPR-Compliant Framework with Verifiable Data Traceability Controls” in Sven Casteleyn, Peter Dolog and Cesare Pautasso (eds) Current Trends in Web Engineering, (ICWE 2016 Workshops, Berlin: Springer International Publishing, 2016) pp. 3-15.
[93] See “MoodPanda Privacy Policy” available at http://moodpanda.com/privacy.aspx (accessed 5 December 2017).
[94] See “RunKeeper Privacy Policy” available at https://runkeeper.com/privacypolicy (accessed 5 December 2017).
[95] Ibid.
[96] Ibid. The only thing that is mentioned about other users is “to enable social-sharing, to find your friends, […] to allow you to communicate and interact with other users”.
[97] Supra n. 41, p. 13.
[98] Daniel Feldman, “The Development and Enforcement of Group Norms”, (1984) 9(1) The Academy of Management Review 47-53, p. 47; Richard Hackman, “Group influences on individuals” in Marvin Dunnette (ed) Handbook of Industrial and Organizational Psychology, (Chicago: Rand McNally, 1976) pp. 1455-1525.
[99] Supra n. 26, p. 6.
[100] Supra n. 25, p. 27-28.
[101] Ibid.
[102] For example in Strava, you can view your performance after running or cycling and then choose whether or not you want to share this with your friends and followers.
[103] See for example supra n. 21, p. 316.
[104] Supra n. 89, p. 207.
[105] Ibid.
[106] Ibid.
[107] Supra n. 98, p. 49. For more information, see Daniel Katz and Robert Kahn, The Social Psychology of Organizations (New York: Wiley, 1978).
[108] Bart Schermer, “The Limits of Privacy in Automated Profiling and Data Mining” (2011) 27(1) Computer Law & Security Review, 45-52, p. 45.
[109] Supra n. 15, p. 69.
[110] Solomon Asch, “Effects of Group Pressures Upon the Modification and Distortion of Judgments” in Greg Swanson, Theodore Newcomb, and Edward Hartley (eds.) Readings in Social Psychology, (New York: Holt, Reinhart & Winston, 1952) pp. 393-401; John Turner, Social Influence, (Milton Keynes: University Open Press, 1991).
[111] Matthew Hornsey, Louise Majkut, Deborah Terry and Blake McKimmie, “On Being Loud and Proud: Non-Conformity and Counter-Conformity to Group Norms” (2003) 42(3) The British Psychological Society 319-335.
[112] Ibid, p. 320.
[113] See for a good example the research done by Deutsch and Gerard in 1955, where participants were required to judge the length of two lines. Some respondents were instructed to give the wrong answer. The study suggested that the pressure to comply with the majority was very high for participants who were not aware of the fact that some respondents were instructed to do so (see: Morton Deutsch and Harold Gerard, “A Study of Normative and Informational Social Influences Upon Individual Judgment” (1955) 51(3) Journal of Abnormal and Social Psychology 629-636). Various other studies have shown that people are generally unwilling to speak out against the majority.
[114] Deborah Lupton, “Food, Risk and Subjectivity” in Simon Johnson Williams, Jonathan Gabe and Michael Calnan (eds) Health, Medicine, and Society. Key Theories, Future Agendas (London: Routledge, 2000) pp. 205-217.
[115] Cristian Rangel, Steven Dukeshire and Letitia MacDonald, “Diet and Anxiety. An Exploration into the Orthorexic Society” (2012) 58(1) Appetite 124-132, p. 124.
[116] Guido Nicolosi, “Biotechnologies, Alimentary Fears and the Orthorexic Society” (2006) 2(3) Tailoring Biotechnologies 37–56.
[117] Frederik Zuiderveen Borgesius, Improving Privacy Protection in the Area of Behavioural Targeting (2015) available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2654213 (accessed 5 December 2017); Avi Goldfarb and Catherine Tucker, “Online Advertising, Behavioral Targeting, and Privacy” (2011) 54(5) Communications of the ACM 25-27.
[118] Frederik Zuiderveen Borgesius et al., “Should We Worry About Filter Bubbles?” (2016) 5(1) Internet Policy Review: Journal on Internet Regulation 1-16; Eli Pariser, The Filter Bubble: What the Internet is Hiding from You (London: Penguin Press, 2011).
[119] GDPR, art. 15(1)(h).
[120] Sandra Wachter, Brent Mittelstadt and Luciano Floridi, “Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation” [2017] International Data Privacy Law 76-99.
[121] The concepts of consent and purpose limitations have been discussed in a plethora of works. These include: Menno Mostert, Annelien Bredenoord, Monique Biesaart and Johannes van Delden, “Big Data in Medical Research and EU Data Protection Law: Challenges to the Consent or Anonymise Approach” (2016) 24 European Journal of Human Genetics 956-960; Beata Safari, “Intangible Privacy Rights: How Europe’s GDPR Will Set a New Global Standard for Personal Data Protection” (2017) 47(3) Seton Hall Law Review 809-848; Tal Zarsky, “Incompatible: The GDPR in the Age of Big Data” (2017) 47(2) Seton Hall Law Review 995-2012.
[122] Sander Voerman, “Health Coaches” in Linda Kool et al (eds) Sincere Support. The Rise of the E-coach, (The Hague: Rathenau Instituut, 2015) p. 41.
[123] Lisa Guerin, “Is Your Employee Wellness Program Legal?” available at http://labor-employment-law.lawyers.com/human-resources-law/wellness-programs-may-be-bad-for-employers-health.html (accessed 5 December 2017); Soeren Mattke et al., Workplace Wellness Programs Study (Final Report, Santa Monica, CA: RAND Corporation, 2013).
[124] Michelle Mello and Meredith Rosenthal, “Wellness Programs and Lifestyle Discrimination – The Legal Limits” (2008) 359 The New England Journal of Medicine 192-199.
[125] Ibid.
[126] Wolf Kirsten, “Making the Link between Health and Productivity at the Workplace —A Global Perspective” (2010) 48 Industrial Health 251-255, p. 254.
[127] See for example “Daimler Speicherte Heimlich Krankendaten” (2009) Der Tagesspiegel available at http://www.tagesspiegel.de/wirtschaft/neuer-datenskandal-daimler-speicherte-heimlich-krankendaten/1497042.html (accessed 5 December 2017).
[128] Marco Guazzi et al., “Worksite Health and Wellness in the European Union” (2014) 56(5) Progress in Cardiovascular Diseases 508-514.
[129] Ibid, p. 510.
[130] Supra n. 26.
[131] Ibid.
[132] Paul Schwartz, “Property, Privacy, and Personal Data” (2004) 117(7) Harvard Law Review 2055-2128, p. 2058.
[133] The Economist, “China invents the digital totalitarian state” (2016) The Economist available at https://www.economist.com/news/briefing/21711902-worrying-implications-its-social-credit-project-china-invents-digital-totalitarian (accessed 5 December 2017).
[134] Ibid.
[135] Primavera De Filippi, “Big Data, Big Responsibilities” (2014) 3(1) Internet Policy Review 1-12.
[136] Morgane Remy, “Personal Data: What if Tomorrow Your Insurance Company Controlled Your Lifestyle?” [2016] Multinationals Observatory.
[137] Sourya De and Daniel Le Métayer, “PRIAM: A Privacy Risk Analysis Methodology (Research Report)” [2016] RR-8876, Inria – Research Centre Grenoble, Rhône-Alpes (hal-01302541). For an example, see Tara Siegel Bernard, “Giving Out Private Data for Discount in Insurance” [2015] The New York Times available at https://www.nytimes.com/2015/04/08/your-money/giving-out-private-data-for-discount-in-insurance.html (accessed 5 December 2017).
[138] Supra n. 136.
[139] See for example: Rajindra Adhikari, Karen Scott, and Deborah Richards, “Security and Privacy Issues Related to the Use of Mobile Health Apps”, (2014) paper presented at the 25th Australasian Conference on Information Systems mHealth App Privacy and Security Issues 8th-10th Dec 2014, Auckland, New Zealand, available at http://www.colleaga.org/sites/default/files/attachments/acis20140_submission_12.pdf (accessed 5 December 2017); Hamed Haddadi, Akram Alomainy and Ian Brown, “Quantified Self and the Privacy Challenge in Wearables” [2014] The IT Law Community; Deborah Lupton, “Quantified Sex: a Critical Analysis of Sexual and Reproductive Self-Tracking Using Apps” (2015) 17(4) Culture, Health & Sexuality 440-453.
[140] Bari Faudree and Mark Ford, “Security and Privacy in Mobile Health” [2013] CIO Journal available at http://deloitte.wsj.com/cio/2013/08/06/security-and-privacy-in-mobile-health/.
[141] Supra n. 132, p. 2055: “Personal information is an important currency in the new millennium”.
[142] “If you are not paying for it, you’re not the customer; you’re the product being sold” – by Andrew Lewis available at https://twitter.com/andlewis/status/24380177712 (accessed 5 December 2017).
[143] Tracey Caldwell, “The Quantified Self: a Threat to Enterprise Security?” (2014) 11 Computer Fraud & Security 16-20, p. 17.
[144] EDPD, art. 6(1)(b).
[145] Michael McCarthy, “Experts Warn on Data Security in Health and Fitness Apps” (2013) BMJ 347.
[146] EDPD, art. 17(1).
[147] Ibid., art. 16.
[148] Ibid., art. 17(1).
[149] Kuan Hon, “Data Security Developments under the General Data Protection Regulation” (2015) LexisNexis, World of IP and IT law.
[150] GDPR, art. 25.
[151] Ibid., art. 32.
[152] Charles Duhigg, “Psst, You in Aisle 5” [2012] New York Times, § 6 (Magazine) available at http://www.nytimes.com/2012/03/04/magazine/reply-all-consumer-behavior.html (accessed 5 December 2017) p. 30.
[153] Charles Duhigg, “How Companies Learn Your Secrets” [2012] New York Times available at http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html (accessed 5 December 2017).
[154] Kate Crawford and Jason Schultz, “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms” (2014) 55(1) Boston College Law Review 93-128, p. 98.
[155] Supra n. 153.
[156] Ibid.
[157] GDPR, art. 6(4).
[158] Nicolas Terry, “Protecting Patient Privacy in the Age of Big Data”, (2012) 81(2) UMKC Law Review 385-415, p. 394.
[159] Ibid.
[160] See for example: Ari Juels, “Targeted Advertising … and Privacy Too” in David Naccache (ed) Topics in Cryptology – CT-RSA 2001, Lecture Notes in Computer Science (Berlin: Springer, 2001); Catherine Tucker, “Social Networks, Personalized Advertising, and Privacy Controls” (2014) 51(5) Journal of Marketing Research 546-562; Hamed Haddadi et al., “Targeted Advertising on the Handset: Privacy and Security Challenges” in Jörg Müller, Florian Alt and Daniel Michelis (eds) Pervasive Advertising (London: Springer-Verlag, 2011) 119-137.
[161] Supra n. 154, p. 98.
[162] See for an article going in depth on the topic of consent: Daniel Solove, “Privacy Self-Management and the Consent Dilemma” (2013) 126 Harvard Law Review 1880-1903.
[163] Eve Caudill and Patrick Murphy, “Consumer Online Privacy: Legal and Ethical Issues” (2000) 19(1) Journal of Public Policy & Marketing 7-19.
[164] Simson Garfinkel and Gene Spafford, Web Security, Privacy, and Commerce (Sebastopol: O’Reilly Media, 2002).
[165] Cesare Bartolini et al., “Assessing IT Security Standards Against the Upcoming GDPR for Cloud Systems” [2015] Presentation at Grande Region Security and Reliability Day 2015.
[166] Omer Tene and Jules Polonetsky, “Privacy in the Age of Big Data. A Time for Big Decisions” [2012] Stanford Law Review, available at https://www.stanfordlawreview.org/online/privacy-paradox-privacy-and-big-data/ (accessed 5 December 2017).
[167] Paul Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization” (2010) 57 UCLA Law Review 1701-1765; Arvind Narayanan and Vitaly Shmatikov, “Robust De-anonymization of Large Sparse Datasets” [2008] Proceedings of IEEE Symposium on Security & Privacy 111-125.