
The European Data Protection Supervisor (EDPS) is an independent institution of the EU, responsible for ensuring that the fundamental rights and freedoms of natural persons, and in particular their right to privacy, are respected when EU institutions and bodies process personal data; the Commission is required to consult the EDPS on relevant legislative proposals. He was appointed in December 2014 together with the Assistant Supervisor with the specific remit of being constructive and proactive. The EDPS published in March 2015 a five-year strategy setting out how he intends to implement this remit, and to be accountable for doing so.

Executive Summary

The digitisation of society and the economy is having a mixed impact on civic engagement in decision-making and on the barriers to public involvement in democratic processes. Big data analytics and artificial intelligence systems have made it possible to gather, combine, analyse and indefinitely store massive volumes of data. Over the past two decades, a dominant business model for most web-based services has emerged which relies on tracking people online and gathering data on their character, health, relationships, thoughts and opinions with a view to generating digital advertising revenue. These digital markets have become concentrated around a few companies that act as effective gatekeepers to the internet and command higher inflation-adjusted market capitalisation values than any companies in recorded history. This digital ecosystem has connected people across the world, with over 50% of the population on the internet, albeit very unevenly in terms of geography, wealth and gender. The initial optimism about the potential of internet tools and social media for civic engagement has given way to concern that people are being manipulated, first through the constant harvesting of often intimate information about them, and second through the control over the information they see online according to the category they are put into. Viral outrage is a key driver of value for many algorithm-driven services, with products and applications designed to maximise attention and addiction. Connectedness, at least under the current model, has led to division. The ensuing debate has revolved around misleading, false or scurrilous information, and responses have focused on transparency measures exposing the source of information while neglecting the accountability of players in the ecosystem who profit from harmful behaviour. Meanwhile market concentration and the rise of platform dominance present a new threat to media pluralism.
For the EDPS, this crisis of confidence in the digital ecosystem illustrates the mutual dependency of privacy and freedom of expression. The diminution of the intimate space available to people, as a result of unavoidable surveillance by companies and governments, has a chilling effect on people's willingness to express themselves freely, including in the civic sphere so essential to the health of democracy. This Opinion is therefore concerned with the way personal information is used in order to micro-target individuals and groups with specific content, the fundamental rights and values at stake, and the relevant laws for mitigating the threats. The EDPS has for several years argued for greater collaboration between data protection authorities and other regulators to safeguard the rights and interests of individuals in the digital society, which is why we launched the Digital Clearinghouse in 2017. Given concerns that political campaigns may be exploiting digital space in order to circumvent existing laws,1 we believe that it is now time for this collaboration to be extended to electoral and audio-visual regulators.

TABLE OF CONTENTS

1. Why are we publishing this Opinion 5
   I. INTENSE ONGOING PUBLIC DEBATE 5
   II. RELEVANCE OF DATA PROTECTION LAW AND POLITICAL CAMPAIGNS 5
   III. THE PURPOSE OF THIS EDPS OPINION 6
2. How personal data is used to determine the online experience 7
   I. DATA COLLECTION 7
   II. PROFILING 8
   III. MICROTARGETING AND MANIPULATION 9
3. The Digital (mis)information ecosystem 9
   I. PLATFORM INTERMEDIARIES AT THE CENTRE OF DIGITAL ADVERTISING 10
   II. NON-COMMERCIAL ADVERTISERS 11
   III. ARTIFICIAL INTELLIGENCE 12
4. Fundamental rights and values at stake 12
   I. DATA PROTECTION AND OTHER FREEDOMS 12
   II. MEDIA PLURALISM 13
   III. FREE ELECTIONS 13
5. Relevant legal frameworks 13
   I. DATA PROTECTION RULES AND PRINCIPLES 13
      Scope 14
      Controllers and accountability 14
      Purpose limitation 15
   II. AUDIO-VISUAL MEDIA RULES 16
   III. ELECTORAL REGULATIONS 16
   IV. CONSUMER PROTECTION 17
   V. COMPETITION LAW 17
6. Recommendations 18
   I. COMPLETE AND ENFORCE DATA PROTECTION RULES 18
   II. REGULATORS SHOULD AIM FOR A COLLECTIVE DIAGNOSIS OF THE PROBLEM 18
   III. REGULATORS SHOULD COOPERATE ACROSS SECTORS 19
   IV. SELF-REGULATION AND CODES OF CONDUCT SHOULD BE ENCOURAGED 20
   V. EMPOWER INDIVIDUALS TO EXERCISE THEIR RIGHTS INCLUDING COLLECTIVE ACTION 20
7. Conclusion 22

1. Why are we publishing this Opinion

i. Intense ongoing public debate

There is currently an intense public debate about the impact of the vast and complex ecosystem of digital information not only on the market economy but also on the political economy, that is, how the political environment interacts with the economy. The major platforms sit at the centre of this ecosystem, gaining disproportionately from the growth in digital advertising, and are increasing their relative power as it evolves. Personal data is needed to segment, to target and to customise the messages served to individuals, but most advertisers are unaware of how such decisions are taken and most individuals are unaware of how their data is being used. The system rewards sensational and viral content and does not in general distinguish between advertisers, whether commercial or political. Revelations of how deliberately this system has been exploited have prompted concern that the integrity of democracies may be under threat. Artificial intelligence systems (the market for which is also characterised by concentration) are themselves powered by data and will, if unchecked, increase the remoteness and unaccountability of decision-making in this environment.

ii. Relevance of data protection law and political campaigns

The fundamental rights to privacy and to data protection are clearly a crucial factor in remedying this situation, which makes this issue a strategic priority for all independent data protection authorities. In their 2005 Resolution on the Use of Personal Data for Political Communication, data protection regulators worldwide articulated key data protection concerns related to the increased processing of personal data by non-commercial actors. The Resolution referred to sensitive data related to real or supposed moral and political convictions or activities, or to voting activities, and to the fact that individuals are currently classified, sometimes inaccurately or on the basis of a superficial contact, as sympathisers or supporters of particular causes2.
The international Resolution called for data protection rules on data minimisation, lawful processing, consent, transparency, data subjects' rights, purpose limitation and data security to be more rigorously enforced. It may now be time for this call to be renewed. EU law on data protection and on the confidentiality of electronic communications applies to data collection, profiling and microtargeting, and if correctly enforced should help minimise harm from attempts to manipulate individuals and groups. Political parties processing voter data in the EU fall within the scope of the GDPR. The GDPR defines personal data revealing political opinions as a special category of data. Processing such data is generally prohibited unless one of the enumerated exemptions applies. In the context of political campaigning, the following exemptions are particularly relevant and merit full citation:

(d) processing is carried out in the course of its legitimate activities with appropriate safeguards by a foundation, association or any other not-for-profit body with a political, philosophical, religious or trade union aim and on condition that the processing relates solely to the members or to former members of the body or to persons who have regular contact with it in connection with its purposes and that the personal data are not disclosed outside that body without the consent of the data subjects;

(e) processing relates to personal data which are manifestly made public by the data subject; [...]

(g) processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject.

Recital 56 of the GDPR provides that '[w]here in the course of electoral activities, the operation of the democratic system in a Member State requires that political parties compile personal data on people's political opinions, the processing of such data may be permitted for reasons of public interest, provided that appropriate safeguards are established'. Several data protection authorities have developed rules or guidelines on data processing for political purposes:

In March 2014, the Italian Data Protection Authority adopted rules on the processing of personal data by political parties. The rules highlighted the general prohibition on using personal data made public on the internet, such as on social networks or forums, for the purposes of political communication, if this data was collected for other purposes3.

In November 2016, the French National Data Protection Commission (CNIL) provided additional guidelines to its 2012 recommendations on political communication, specifying the rules for processing personal data on social networks. In particular, CNIL underlined that aggregation of personal data of voters in order to profile and target them on social networks can only be lawful if based on consent as a ground for data processing4.

In April 2017, the UK Information Commissioner's Office (ICO) issued updated Guidance on political campaigning, which also included guidelines on the use of data analytics in political campaigning. The ICO explained that when a political organisation commissions a third-party company to carry out analytics, then that company is likely to be a data processor, whereas the organisation is a controller.
The specific provisions of data protection law governing the controller-processor relationship have to be accounted for in order for the processing to be lawful5. The guidelines of the national data protection authorities have the potential to provide additional authoritative interpretation of data protection and privacy law provisions, taking account of the differences in the organisation of national political systems6.

iii. The purpose of this EDPS Opinion

The EDPS vision is to help the EU lead by example in the global dialogue on data protection and privacy in the digital age, by identifying cross-disciplinary policy solutions to the big data challenges and developing an ethical dimension to the processing of personal information7. We have highlighted the ethical issues around the effects of predictive profiling and algorithm-determined personalisation8. We have called for responsible and sustainable development of the digital society based on individual control over personal data concerning them, privacy-conscious engineering, and accountability and coherent enforcement9. The EDPS Ethics Advisory Group has considered the microtargeting of electoral canvassing [...]

Some data is provided directly by individuals, for example by filling in an online form. Most data, however, is observed or recorded automatically, deposited unwittingly as a result of online and offline activities22. Such observed data include the times and locations at which mobile devices connect with mobile telephone towers or GPS satellites, IP addresses of terminals, WiFi access points, browsing history, images collected by digital CCTV systems, purchase history, social media engagement and browsing behaviour across devices23. According to a recent study, people are much more likely to share false information than true information, and bots and trolls, including those acting on behalf of hostile third states, contribute to this further dissemination24. A significant category is the data collected from people who take online psychological quizzes, which often achieve viral popularity when accessed and shared over social media and enable intricate personality prediction25. Companies use tracking technologies to collect observed data, typically cookies as well as flash cookies, web beacons and device fingerprinting, which can track people across different devices26. Meanwhile the proliferation of connected things and listening devices installed in the home, such as smart speakers (the market for which is also already characterised by concentration), presents new possibilities to observe real-world behaviour27. When messages and content targeted at an individual based on profiling elicit a reaction from that individual, the reaction is in turn monitored, which creates additional data for collection and use to refine the profile and future targeting.

ii. Profiling

Collected personal data is examined to segment people according to precise profiles. There exists a myriad of traits which can be measured and which can be used to infer user preferences from a user profile, such as age, gender, location and so on28. The major social media provider is estimated to have used over 52,000 traits and attributes to classify its users.
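To make the device fingerprinting mentioned in the data collection section above concrete: a handful of individually innocuous browser attributes, combined and hashed, can yield a near-unique identifier that persists even when cookies are deleted. The attribute names and hashing scheme below are illustrative assumptions for a minimal sketch, not any vendor's actual method.

```python
import hashlib

def device_fingerprint(attributes: dict) -> str:
    """Derive a stable identifier from a set of browser attributes.

    Illustrative sketch only: real fingerprinting scripts collect many
    more signals (canvas rendering, installed plugins, audio stack...).
    """
    # Sort keys so the same device always produces the same canonical string.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    # Hash to a short identifier; nothing needs to be stored on the device.
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080",
    "timezone": "Europe/Brussels",
    "fonts": "Arial,DejaVu Sans,Liberation Serif",
    "language": "en-GB",
}

# The same attribute set yields the same identifier on every visit,
# allowing recognition across sites without any cookie.
print(device_fingerprint(visitor))
```

Because the identifier is recomputed from the device's own characteristics on each visit, clearing local storage does not break the link between visits, which is why regulators treat fingerprinting on a par with cookies.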
Statistical methods are then used to generate analytical information or to predict future behaviours or developments29. Automated profiling identifies patterns that are invisible to the human eye30. The more user data is available about a person, and the longer a user can be profiled, the richer the inferences become31. More advanced profiling practices allow scoring or assessing people against benchmarks of predefined patterns of normal behaviour. An example of such applications is hiring software32. Another example is how typing patterns on a computer keyboard can reveal a person's emotional state: nervousness, sadness and other feelings can be predicted from seemingly non-sensitive information, such as keystroke dynamics33. Big data combined with behavioural science enables inferences about even deeper personality portraits. Some data analytics companies specialise in assessing individuals based on five personality traits derived from online personality tests (see above), a technique reported to have been exploited by campaigners during the 2016 US Presidential elections and the UK Brexit referendum34. These assessments are then supplemented with additional characteristics, including values and needs, likes and shares35. Profiling also serves to identify other people who might be potentially interested in a product or message, for instance through the audience-matching tools offered by the major social media platforms36.

The quality of the new knowledge created as an outcome of profiling is subject to debate. Certain studies show that data mining techniques can predict a person's personality more accurately than most of their friends and family37. Others consider profiling situational and inherently probabilistic38. In any case, the created knowledge is further used to make decisions (automated or not) about a person or a group of people.

iii. Microtargeting and manipulation

Decisions based on profiling create an informational environment with a high degree of personalisation, a practice referred to as microtargeting39. It may consist in serving a more personal message to a segment of people sharing certain traits, or even in potentially determining the prices for products or services. It may consist in how social media platforms determine which content appears on individual news feeds and in what order. Companies in the business of selling digital ad space profit from the placing of targeted content irrespective of any ethical considerations: there is no distinction made between a good or bad click from a target demographic40. These microtargeting activities may have little effect on some individuals, but the complexity of the technology, low levels of trust and the avowed intentions of several important tech players point towards a culture of manipulation in the online environment41. This manipulation may occur as a result of the business strategies chosen by market players themselves, or because of the actions of individuals and states seeking to use platform intermediaries to disrupt or subvert markets and public discourse. Moreover, the intention behind the design of devices and software has been to induce addictive behaviour.
Features (like auto-play, endless newsfeeds, and the reciprocation of messages or image sharing) are, according to a number of former employees in the tech industry, deliberate attempts to maximise attention through microtargeting towards users, especially children, similar to the techniques used by the gambling industry42. Web design itself can be used to nudge people towards particular choices, for instance by making it difficult to express certain preferences (such as declining to add additional items, like insurance, to a shopping cart)43. Manipulation also takes the form of microtargeted, managed content display which is presented as being most relevant for the individual but which is in fact determined in order to maximise revenue for the platform. The major platforms admitted in 2017 that over 125 million individuals in the United States had been reached by content, advertisements and messages from fake accounts. Further reports released just before the publication of this Opinion have alleged a far more widespread degree of intrusion, although the precise effects on actual voting behaviour remain unknown44. A more significant and chronic form of manipulation may however be the chilling effect upon freedom of expression which results from the constant surveillance that characterises the digital ecosystem45.

3. The Digital (mis)information ecosystem

Manipulation and misinformation are as old as humankind, but with rapid digitisation they have become matters of pressing social, legal and ethical importance. It had been hoped and expected that new forms of civic engagement would flourish as more people connected to the

internet through online campaigns, crowdsourcing and cause-based communities on social media46. Currently, however, the sustainability of microtargeting is the subject of heated debate47. Manipulation by means of microtargeting presupposes the existence of, and access to, databases with a variety of data points about individuals, as well as intellectual property solutions in the form of analytical algorithms that can draw inferences and predictions about people using these data. It is a multi-layered process in which two groups of actors interact: the advertising ecosystem, which relies on the collection and analysis of personal data as its prevailing business model, and non-commercial advertisers. A third big player is emerging in artificial intelligence, which further blurs the lines of accountability. This complex, broad digital ecosystem, composed of businesses and organisations which may have been regulated in the past by different areas of law (consumer law, electoral law, media law, competition law, etc.), makes it more challenging to assign legal responsibility to each of them, to enforce existing rules and to ensure that individuals have recourse to an effective remedy where abuses occur.

i. Platform intermediaries at the centre of digital advertising

A very small number of giant companies have emerged as effective gatekeepers of the digital content which most people consume. They occupy a commanding position among a variety of other actors including advertising businesses, data brokers and data analytics companies. In the 2015 EU citizenship consultation, more than seven out of ten respondents (72%) said they use internet platforms as a source of information; newspapers (63%) and TV (62%) were the second and third most popular sources of information on EU matters49. In Europe, currently more than a third of advertising spend goes to digital channels, surpassing TV advertising (although there are significant differences between regions). In the UK, one of the more advanced digital markets, more than 50% of every advertising pound spent goes to online channels48. Most search traffic has migrated to smartphones, where the biggest company has a 97% market share. Advertisers who use one of the two major platforms, which are reported to account for between 80% and 99% of all revenue growth from digital advertising, cannot control where their adverts are placed. Opaque algorithms have placed such ads on sites displaying racist, incendiary or scurrilous content, which has led to a number of large advertisers withdrawing from programmatic ad marketplaces, where software is used to buy and sell advertising50. In many countries, one of the two biggest tech companies has become the only gateway to the internet51. There is less capital investment in start-ups (down 40% since 2015), indicating that investors see less scope for disruption in the concentrated market52. Data analytics could help individuals navigate through the increasingly noisy information environment. The reality, however, has been to tip the balance of benefits away from individuals, deepening the informational asymmetry in favour of the owners of proprietary algorithms. By limiting exposure to certain information, for instance job advertisements filtered by gender or inferred health status, they may further perpetuate discriminatory attitudes and practices53. In effect, the forum for public discourse and the available space for freedom of speech is now bounded by the profit motives of powerful private companies who, due to technical complexity or on the grounds of commercial secrecy, decline to explain how

decisions are made. The few major platforms, with their extraordinary reach, therefore offer an easy target for people seeking to use the system for malicious ends.

ii. Non-commercial advertisers

Advertisers are not limited to commercial players seeking customer insights54. Governments, political and ideological movements, political parties, campaigns, political candidates and other cause-driven organisations have always sought to spread their message, rally volunteers, recruit donors and otherwise influence public opinion and build communities, both online and offline. They do not seek to sell or promote a commercial product or service, but rather to communicate their message in order to influence the political, social or other views of individuals and to encourage, or discourage, support for a cause or a vote in an election55 56. Until recently, non-commercial advertisers had access to only limited data about their constituency. Now they have begun to exploit the same targeted internet advertising system used by commercial entities, mining the reactions and discussions on social media in real time and aggregating data on the personality traits and likely voting behaviour of the electorate. Many governmental institutions, political parties and other interest groups have dedicated websites, which to a larger or lesser extent use the tracking technologies discussed above. They also have an active presence on social media and make use of the targeting (advertising) tools offered by online businesses57. Non-commercial advertisers can, for instance, set up fan pages using the platforms' publishing tools. Fan page administrators can obtain viewing statistics, and choose audiences among fan page followers and among all platform users on the basis of demographics, interests, behaviour or other criteria in order to better personalise their messages. They can then customise messages to be served back to audiences according to profile and location58. How these tools are used varies between countries and types of organisation59.
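The audience selection that such administrator tools expose (choosing recipients by intersecting demographic and interest criteria) can be sketched as follows. The field names and criteria are hypothetical; the principle of narrowing a follower base down to a targetable segment is what the sketch illustrates.

```python
from dataclasses import dataclass, field

@dataclass
class Follower:
    """Hypothetical record a fan-page tool might hold per follower."""
    uid: str
    age: int
    region: str
    interests: set = field(default_factory=set)

def select_audience(followers, min_age, max_age, region, interest):
    """Return the ids of followers matching every criterion, i.e. the
    segment that would be served one customised message variant."""
    return [f.uid for f in followers
            if min_age <= f.age <= max_age
            and f.region == region
            and interest in f.interests]

followers = [
    Follower("a", 34, "IE", {"environment", "cycling"}),
    Follower("b", 52, "IE", {"fishing"}),
    Follower("c", 29, "FR", {"environment"}),
]

# Target 25-45 year-olds in region "IE" interested in "environment".
print(select_audience(followers, 25, 45, "IE", "environment"))  # -> ['a']
```

Each added criterion shrinks the audience and sharpens the profile, which is why even simple conjunctive filters like these can single out very small, precisely characterised groups of voters.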
There is thus a blurring of the lines between commercial and political targeting: where campaigns once relied on voter registration and party affiliation, data analysts now process any information revealing personality traits. Political campaigns are increasingly relying on big data analytics to influence opinions and voting behaviour, and in some cases the alleged aim is to target people with misleading information60. The ability of AI and big data to significantly influence democratic processes, certainly outside the United States, is contested. The available empirical evidence from political campaign practices in the Netherlands and Germany shows little engagement with microtargeting practices due to practical limitations, which include lack of expertise, funds, particularities of the local jurisdiction, or the legal framework itself61. On the other hand, the ongoing investigation by the UK Information Commissioner, with a parallel investigation conducted by the Electoral Commission, into alleged data protection abuses in campaigning during the Brexit referendum is scrutinising the activities of 30 organisations, including political parties and campaigns, data companies and social media platforms62. Regardless of their effectiveness, there is a clear interest from non-commercial advertisers in exploring the targeting techniques initially developed for the commercial sector63.
