Europe’s digital rights debate nearing the finishing line

Europe is very close to the finishing line of an extraordinary project: the adoption of the new General Data Protection Regulation (GDPR), a single, comprehensive replacement for the 28 different laws that implement Europe’s existing 1995 Data Protection Directive.

The data protection reform package consists of two draft laws: a general regulation covering the bulk of personal data processing in the EU and a directive on processing data to prevent, investigate, detect or prosecute criminal offences or enforce criminal penalties.

Green MEP Jan Philipp Albrecht (Germany) is the lead MEP for the draft regulation, which updates the principles set out in the 1995 directive to reflect the changes in data processing brought about by the internet: data generated on social networks, through online shopping and e-banking services, as well as offline databases held by hospitals, schools, businesses and research companies.

Parliament, the Council and the Commission are now meeting in three-way talks on the data protection regulation, with the EP aiming to reach agreement on both the regulation and the directive before the end of 2015.

The regulation's wide array of proposals touches almost every aspect of people's lives as they play out on the internet.

Right to be forgotten (Article 17)

MEPs say people should have the right to have their personal data erased if data processing does not comply with EU rules; the data are no longer necessary for the purposes for which they were collected; or the person objects or withdraws his/her consent for the processing of his/her personal data.

To enforce this right, an internet company that is asked to erase a person's data would also have to forward the request to any other parties that replicate the data.

There will be restrictions, for instance where the data are needed for historical, statistical or scientific research purposes, for public health reasons, or to exercise the right to freedom of expression. Nor will the right apply where the retention of personal data is necessary to fulfil a contract or is required by law.

The ‘right to be forgotten’ as expressed in Article 17 of the Commission’s draft of the Regulation is the “right to obtain from the data controller the erasure of personal data relating to them and the abstention from further dissemination of such data, especially in relation to personal data which are made available by the data subject while he or she was a child”, where one of four grounds applies.

What is contentious is the extent to which data controllers or search engines like Google should be burdened with the responsibility for removing content. The Commission has proposed that they “inform third parties which are processing such data, that a data subject requests them to erase (…) personal data”, so that third parties such as publishers can have a say when removal requests are submitted to search engines in respect of their content.

The EP’s amended text goes even further, suggesting that the data controller “shall take all reasonable steps to have the data erased, including by third parties”.

This would make data controllers judge and jury when considering such requests.

Explicit consent (Article 7)

How many times have we bothered to read the terms and conditions that come with every app we download on our mobile phones or every piece of software we install from the internet?

Well, the European Parliament says that where processing is based on consent, a company should process personal data only after obtaining clear permission from the data subject, and that people should be able to withdraw that consent at any time.

Consent here means “any freely given, specific, informed and explicit indication of his/her wishes, either by a statement or by a clear affirmative action”. MEPs say the execution of a contract or the provision of a service cannot be made conditional upon consent to the processing of personal data that is not strictly needed for the completion of that contract or service.

The EP also says that the consent should lose its effect as soon as the processing of personal data is no longer needed for the initial purpose for which it was collected. 

Profiling (Article 20)

Ever felt Google was following your every mouse click? Ever searched for something, only to see adverts for exactly that thing appear on the next website you visit?

It’s a strategy that drives advertising revenue for search engines, and it usually happens through “profiling”: a technique that analyses or predicts a person’s performance at work, economic situation, location, health, preferences, reliability or behaviour based on the automated processing of their personal data.
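
To make the idea concrete, here is a minimal, hypothetical sketch in Python of the kind of automated processing the definition describes: scoring a user’s likely interests from their search history and picking an advert category. The categories, keywords and scoring rule are all invented for illustration and do not reflect any real advertising system.

```python
# Hypothetical illustration only: a toy interest profiler.
from collections import Counter

# Invented interest categories and the keywords that signal them
CATEGORY_KEYWORDS = {
    "travel": {"flights", "hotel", "beach", "visa"},
    "finance": {"loan", "mortgage", "insurance", "savings"},
    "fitness": {"running", "gym", "protein", "marathon"},
}

def profile(search_history):
    """Count how many searches touch each interest category."""
    scores = Counter()
    for query in search_history:
        words = set(query.lower().split())
        for category, keywords in CATEGORY_KEYWORDS.items():
            if words & keywords:
                scores[category] += 1
    return scores

def pick_advert_category(search_history):
    """Return the highest-scoring category, or None if nothing matched."""
    scores = profile(search_history)
    return scores.most_common(1)[0][0] if scores else None

print(pick_advert_category(["cheap flights to malta", "hotel in valletta", "best running shoes"]))
# -> "travel": two travel-related searches outweigh one fitness-related one
```

Real profiling systems are of course far more sophisticated, but the principle – inferring attributes from behavioural data without any human in the loop – is what Article 20 is aimed at.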

Under the changes proposed by MEPs, profiling, as a general rule, would only be allowed with the consent of the person concerned, where permitted by law or when needed to pursue a contract. 

Parliament also makes clear that profiling should not be based solely on automated processing and should comprise human assessment, including an explanation of the decision reached after such an assessment. 

Data portability

Under the Commission proposal, any person would have the right to ask an email service provider or a social network to provide a copy of all their data in an electronic, commonly used format, to be transferred to another provider or service.

MEPs also propose merging the right to data portability with the right to data access (Article 15) and stress that, for personal information processed by electronic means, the controller should provide a copy of these data “in an electronic and interoperable format”.  This would allow users to switch email providers without losing contacts or previous emails, for instance. Where technically feasible and at the request of the data subject, the data would be transferred directly from controller to controller (e.g. from email provider to email provider).
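
As an illustration of what a copy “in an electronic and interoperable format” might look like in practice, here is a minimal sketch in Python that exports a user’s contacts as JSON and parses them back on the receiving side. The field names and the JSON layout are assumptions made for this example; the regulation itself does not prescribe a particular format.

```python
import json

def export_contacts(contacts):
    """Serialise a user's contacts into a commonly used, machine-readable format (JSON)."""
    return json.dumps({"format": "contacts/v1", "contacts": contacts}, indent=2)

def import_contacts(payload):
    """A receiving provider parses the same format back into its own records."""
    return json.loads(payload)["contacts"]

exported = export_contacts([{"name": "A. Borg", "email": "a.borg@example.com"}])
print(import_contacts(exported))  # the new provider now holds the same contact list
```

Direct controller-to-controller transfer would simply mean the exporting provider sending such a payload to the new provider on the user’s behalf, rather than handing it to the user first.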

Is the ‘right to be forgotten’ bad news for our own rights?

At the heart of the debate over the details of Europe’s data protection reform package is a tug-of-war between data protection advocates on one side and, on the other, companies and government authorities seeking to get around privacy rules.

Critics of the GDPR will certainly say that, in the MEPs’ determination to protect the personal information of users online, the ‘right to be forgotten’ has been taken on wholesale, without stopping to consider the rights of publishers exercising their own right to free expression, or the right of their audiences to receive such information.

For newspapers in particular, this is a time-bomb: a non-stop stream of removal demands from private individuals and those in power could see swathes of online reporting taken down and censored.

The right of erasure has already become a reality, with Google offering a service for the removal of search engine links. In fact, the first sign of how this might pose a problem for free speech online came with the 2014 judgment of the European Court of Justice in Google Spain v. Mario Costeja González, the so-called Right to Be Forgotten case. The decision created an ambiguous responsibility for search engines to censor the internet, even where the information was truthful and had been lawfully published.

Critics say the problem with the GDPR is that it doubles down on Google Spain.

Here’s how the Electronic Frontier Foundation’s senior policy analyst Jeremy Malcolm puts it.

First, search engines and other internet intermediaries must block the content immediately, without notice to the user who uploaded it – unlike the current process, which gives search engines some time to consider the legitimacy of the request.

“After reviewing the (also vague) criteria that balance the privacy claim with other legitimate interests and public interest considerations such as freedom of expression, and possibly consulting with the user who uploaded the content if doubt remains, the intermediary either permanently erases the content (which, for search engines, means removing their link to it) or reinstates it.”

And if the intermediary does erase the information, it is not required to notify the uploading user, but it is required to notify any downstream publishers or recipients of the same content, and to disclose any information it holds about the uploading user to the person who requested the removal.

“Think about that for a moment. You place a comment on a website which mentions a few (truthful) facts about another person. Under the GDPR, that person can now demand the instant removal of your comment from the host of the website, while that host determines whether it might be okay to still publish it,” Malcolm says.

“If the host’s decision goes against you (and you won’t always be notified, so good luck spotting the pre-emptive deletion in time to plead your case to Google or Facebook or your ISP), your comment will be erased. If that comment was syndicated, by RSS or some other mechanism, your deleting host is now obliged to let anyone else know that they should also remove the content.

“Finally, according to the existing language, while the host is dissuaded from telling you about any of this procedure, they are compelled to hand over personal information about you to the original complainant. So this part of EU’s data protection law would actually release personal information!”

Then there are the penalties. If the host fails to remove content that a data protection authority later determines should have been removed in the first place, it may become liable to fines of up to €100 million or 5% of its global annual turnover.

No doubt, this means intermediaries will take information down whenever there is even a remote possibility that the information has indeed become “irrelevant”.
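
Pulling together the steps that critics describe, the sketch below models that notice-and-takedown flow in Python: block first, review later, then either reinstate or erase and notify downstream recipients. It is a schematic of the process as characterised above, not an implementation used by any real intermediary, and all names in it are invented.

```python
from dataclasses import dataclass, field

@dataclass
class RemovalRequest:
    content_id: str
    complainant: str
    blocked: bool = False
    outcome: str = "pending"                      # becomes "erased" or "reinstated"
    downstream_notified: list = field(default_factory=list)

def handle_request(request, downstream_recipients, serves_public_interest):
    # Step 1: block the content immediately, without notifying the uploader
    request.blocked = True
    # Step 2: weigh the privacy claim against freedom of expression and the public interest
    if serves_public_interest:
        request.outcome = "reinstated"
        request.blocked = False
    else:
        # Step 3: erase (for a search engine, delist) and tell downstream recipients to do the same
        request.outcome = "erased"
        request.downstream_notified = list(downstream_recipients)
    return request

print(handle_request(RemovalRequest("comment-42", "complainant@example.com"),
                     downstream_recipients=["rss-subscriber"],
                     serves_public_interest=False))
```

Note where the incentives sit: the intermediary bears the risk of a fine if it keeps content up, but faces no comparable penalty for taking it down.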

“These GDPR removals won’t just be used for the limited purpose they were intended for. Instead, it will be abused to censor authors and invade the privacy of speakers. A GDPR without fixes will damage the reputation of data protection law as effectively as the Digital Millennium Copyright Act permanently tarnished the intent and purpose of copyright law,” says Malcolm.

Critics also charge that the right to be forgotten runs contrary to rights implied in the Manila Principles on Intermediary Liability, because it imposes an obligation on the intermediary to remove content prior to any order by an independent and impartial judicial authority.

The GDPR does not even set out detailed minimum requirements for requests for erasure of content.

And given the lack of due process for the user who uploaded the content, this could set the stage for mistaken over-blocking, with little transparency or accountability built in.

Responding to the online threat of terrorism • The Dati Report

In the wake of the Paris attacks, nothing could be more contentious than the European Parliament’s controversial – although arguably important – political statement on the radicalisation and recruitment of EU citizens by terrorist organisations.

An estimated 5,000 European citizens have joined terrorist organisations fighting in Iraq and Syria.

With the issue of foreign fighters posing challenges for governments across the EU, a report on preventing the radicalisation and recruitment of Europeans by terrorist organisations was adopted by the civil liberties committee on 19 October.

MEP Rachida Dati

Rapporteur Rachida Dati (EPP, France) says there are hotbeds of radicalised Europeans across the EU, and that, thanks to the Schengen area, EU citizens may travel freely between member states.

“Over the past two years, terrorist attacks carried out by radicalised European citizens have taken place in a number of countries… we face a threat which impacts on all of us, and this is why a truly European response is needed. This need not mean fewer competences for member states but simply more coordination and collaboration.”

Dati’s report calls on the EU to push its own counter-arguments against those of online terrorists, to prosecute internet giants that refuse to delete illegal content, to segregate radicalised inmates in prisons, to engage in dialogue with the various religious communities, to prevent radicalisation through education, and to tackle terrorism-funding channels by guaranteeing greater transparency on external financial flows.

The internet is in fact one of the primary channels of radicalisation. “The biggest problem with the internet is the publishing and proliferation of illegal content. The internet giants must accept their responsibility. Should they refuse to cooperate or show themselves unwilling, I propose that they face criminal charges,” Dati says.

EDRI (European Digital Rights) – an association of civil and human rights organisations from across Europe that monitors data retention, copyright, cybercrime and net neutrality issues – says MEPs should nevertheless fix certain areas of the report: the role of ISPs and their responsibility when dealing with illegal content, companies’ involvement in counter-terrorist narratives, and the EU Passenger Name Records (PNR) Directive.

Hosting providers are already required to act expeditiously to delete or disable access to illegal content online. The draft resolution, however, asks for companies to be held criminally liable if they fail to act, even where the request is merely an “administrative request” rather than a court order.

EDRI says that there is no evidence to suggest that there are companies active in Europe whose failure to act to address illegal terrorist content could be considered a criminal act in its own right. “The criminalisation of internet companies to address a problem that does not exist brings no benefits, but supporting this measure, even in principle, sets a terrible international precedent.”

EDRI also says it is dangerous that the draft resolution calls on member states to manipulate online speech as a mechanism for preventing radicalisation and hate speech. “This risks being counterproductive, contrary to democracy and its values, and has worrying echoes of state-sponsored ‘troll armies’ in jurisdictions with weak democratic values,” EDRI said.

But if the Dati resolution succeeds, EDRI suggests, it “will be sending a clear and legally tenable message: Europe will remain united and will not renounce democracy or citizens’ fundamental rights and freedoms.”

A crucial ‘battleground’ in European security

Maltese MEPs Therese Comodini Cachia and Roberta Metsola (EPP) stressed the need to uphold individual rights in a debate organised by the European Parliament Office in Malta.

“The challenge for the EU is to find the right balance between ensuring privacy online and security of citizens; to ensure that social media are not used as a platform for hate propaganda and for terrorists to communicate online,” Metsola said.

“We need to target the criminals without infringing the rights of the vast majority of citizens unnecessarily.” 

Comodini Cachia echoed this sentiment, stressing that “when as a consequence of terrorism we are forced to give up some of our rights, it is definitely worrying.”

“Our ancestors worked hard to achieve the rights we enjoy today and the past shows us where the balance lies, especially through human rights case law.” 

Referring to the requirement that all data be treated without discrimination or interference, Comodini Cachia said the discussion was now moving from ‘net neutrality’ to ‘zero rating’, whereby access to certain services is offered free of data charges. “The internet is to be considered a common good that is available to all – from big cities to rural areas.”

Metsola also stressed the importance of digital awareness: “What was termed as ‘the right to be forgotten’ would apply only within the EU territory. We do not have a guarantee that it would be deleted elsewhere in the world. Education and awareness on how to navigate the online environment remain key.”