The right to be forgotten forms part of the European Union's General Data Protection Regulation (GDPR), giving data subjects the right to have their personal details removed from a system upon request.

Although in principle the right to be forgotten (or right to erasure) seems a fair ask, it has been the subject of much controversy, especially among free-speech advocates who believe that, from a public-interest point of view, individuals should not get to decide what is and isn't deleted.

However, now that the legislation is part of the law, it must be respected: should a customer, partner or employee request that their data be removed from your database, this should be done quickly and comprehensively, leaving no trace of the original information.
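As an illustration only, a "comprehensive" erasure in a relational database might look like the minimal Python sketch below. The schema (customers, orders, subject_id) is entirely hypothetical, and a real system would also need to cover backups, logs and caches:

```python
import sqlite3

# Illustrative sketch only: table and column names (customers, orders,
# subject_id) are invented, not taken from any real schema.
def erase_data_subject(conn: sqlite3.Connection, subject_id: int) -> None:
    """Remove every record tied to one data subject in a single
    transaction, so a partial failure cannot leave stray data behind."""
    with conn:  # commits on success, rolls back on error
        conn.execute("DELETE FROM orders WHERE subject_id = ?", (subject_id,))
        conn.execute("DELETE FROM customers WHERE id = ?", (subject_id,))

# Usage with an in-memory database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, subject_id INTEGER)")
conn.execute("INSERT INTO customers VALUES (1, 'Jane Doe')")
conn.execute("INSERT INTO orders VALUES (10, 1)")
erase_data_subject(conn, 1)
customers_left = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
orders_left = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

Wrapping the deletes in one transaction is the key design choice here: either all of the subject's records go, or none do.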


The right to be forgotten has been discussed in courts across Europe since 2014, when a Spanish citizen requested that details of his house repossession be removed from Google because the information was no longer accurate.

The Spanish court agreed, as did the EU Court of Justice, and from then on it became generally understood that inaccurate or out-of-date information should, upon request from the parties involved, be allowed to disappear without a trace.

In fact, it became a huge talking point, and more and more people started requesting that information about them be removed from public databases, of which Google's is probably the biggest. Now, anyone can request that out-of-date, irrelevant, excessive or inaccurate data be removed from Google search results, and Google can then decide whether the claim is rightful.

Google has now created a system for taking and assessing complaints, and releases data on how many right to be forgotten requests it has received and how many it has acquiesced to. As of the beginning of 2018, Google said it had received 655,000 requests to de-list 2.5 million links; it removed roughly four in ten.


But the 'right to be forgotten' is not considered in isolation and must be balanced against other rights, such as that of freedom of expression, the court said, with information that's considered to be in the public interest unlikely to be removed on request.

Google and other search engines must consider removing links to any information that is inaccurate, inadequate, irrelevant or excessive when an individual files a request about their own search results. With Google, the 'right to be forgotten' can be exercised via its online request form.

Right to be forgotten: GDPR

The right to be forgotten ruling was based on the EU's 1995 Data Protection Directive, which stated in Article 12 that people can ask for their personal data to be deleted once it's no longer necessary. The ruling outlined when and how search engines like Google must honour such a request.

However, the General Data Protection Regulation (GDPR), which applies to all EU member states (and to any organisation using EU citizens' personal data) from 25 May 2018, replaces the 1995 Directive. Intended to update privacy and data protection rules for the digital age, the GDPR also updates the definition of the right to be forgotten.

In Article 17, the GDPR considers the right to be forgotten in the context of organisations collecting and processing people's personal data. It retains the 1995 Directive's intent to allow people to request that their data be deleted when it's no longer relevant, but expands this right to give people more control over who can access and use their personal data.

Under GDPR then, an EU citizen has the right to demand an organisation erases their personal data if:

  • the data is no longer relevant to the purpose for which it was collected;

  • the person withdraws their consent for their data to be used (and the organisation has no other legal basis for processing it);

  • the person objects to their data being used for marketing purposes, or their rights override the organisation's legitimate interests in processing it (for instance, where sensitive data concerning a child is involved);

  • the data was unlawfully processed;

  • erasure is necessary to comply with a legal obligation;

  • the data belongs to a child and was collected in exchange for "information society services".
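The grounds above can be sketched as a simple any-of check. This is an illustrative model only, not a compliance tool: the field names below are invented, and real cases turn on legal judgment, not booleans:

```python
from dataclasses import dataclass

# Hypothetical model of the Article 17 erasure grounds, for illustration.
@dataclass
class ErasureRequest:
    data_still_relevant: bool        # still needed for its original purpose?
    consent_withdrawn: bool          # subject has withdrawn consent
    other_legal_basis: bool          # another lawful basis for processing remains
    objects_to_marketing: bool       # objection to marketing use / rights override
    unlawfully_processed: bool       # data was processed unlawfully
    erasure_legally_required: bool   # erasure needed to meet a legal obligation
    child_data_for_online_service: bool  # child's data, "information society services"

def must_erase(r: ErasureRequest) -> bool:
    """True if any of the (simplified) Article 17 grounds applies."""
    return (
        not r.data_still_relevant
        or (r.consent_withdrawn and not r.other_legal_basis)
        or r.objects_to_marketing
        or r.unlawfully_processed
        or r.erasure_legally_required
        or r.child_data_for_online_service
    )
```

Note how withdrawal of consent alone is not enough: it only triggers erasure when no other legal basis for the processing remains, mirroring the second bullet above.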

In all these cases, the organisation must delete the data "without undue delay" - i.e. as soon as possible. If it has made the data public, it must take "reasonable steps, including technical measures" to inform any other organisations processing that data that the citizen has asked for it to be removed.
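That "reasonable steps" duty amounts to a fan-out: once the data is erased, every known downstream processor is told to erase it too. A minimal sketch, assuming a hypothetical registry of processors and a pluggable notification channel (both invented for illustration):

```python
from typing import Callable, Iterable

# Sketch only: in practice the "notifier" would be an email, API call or
# other formal channel, and the processor registry a maintained record of
# every organisation the data was shared with.
def notify_processors(subject_id: str,
                      processors: Iterable[str],
                      notifier: Callable[[str, str], None]) -> int:
    """Inform each downstream processor of the erasure request.
    Returns how many processors were notified."""
    count = 0
    for processor in processors:
        notifier(processor, f"Please erase all data for subject {subject_id}")
        count += 1
    return count

# Usage with a simple in-memory notifier:
sent = []
notified = notify_processors(
    "subject-42",
    ["Acme Analytics", "Mailer Co"],
    lambda processor, message: sent.append((processor, message)),
)
```

Keeping the notification channel as a parameter means the same fan-out logic works whatever the actual delivery mechanism is.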

However, organisations don't have to honour these requests if they're complying with legal obligations, exercising their right to freedom of expression or the right to freedom of information, if the data is in the public interest or to establish, exercise or defend legal claims.

Right to be forgotten: UK

Until the UK officially leaves the EU in March 2019, the right to be forgotten rule, as well as the rest of the principles outlined in GDPR, will apply.

But beyond that, the UK will seek to implement the GDPR into UK law in the form of the Data Protection Bill, which includes a right to be forgotten giving people the right to ask an organisation to erase their personal data.

Data subjects may, for example, ask social media companies to delete posts they published earlier in their lives that would hinder them personally or professionally as adults.

There are two key UK cases so far in which individuals have invoked their right to be forgotten: a claimant known as NT1, whose case began on 27 February 2018, and another known as NT2, whose case began on 13 March 2018. Both are men, and both are bringing challenges against Google, according to reports.

They hope to have Google remove links in its search results to articles detailing previous convictions for crimes committed in a place of work, arguing that these reports have damaged their personal relationships and hindered their professional reputations. NT2's court filing even asserted that he had faced attempted blackmail and had been threatened in public.

Although the convictions are governed by an updated law stipulating that they need not be revealed to employers, Google claims the information is in the public interest. At a pre-trial hearing in January 2018, judge Matthew Nicklin said the cases hinge on whether the claimants' right to privacy (i.e. the right to be forgotten) or Google's right to freedom of expression should be upheld.

Though the date for Justice Warby's decisions is yet to be disclosed, the rulings in both instances are likely to set a precedent for future cases.

What will be removed?

To qualify for removal, the information must be deemed "irrelevant, outdated, or otherwise inappropriate", and the request must be accompanied by a digital copy of the applicant's official identification. Failure to remove links that fall within the EU court ruling's definition will result in fines.

Who is regulating the right to be forgotten?

How Google handles complaints and requests to remove information from its search results will be overseen by a taskforce of European privacy watchdogs known as the Article 29 Working Party.

Following the flood of requests received by Google, Professor Luciano Floridi, the person tasked with determining how Google can comply with the recent EU court ruling, said in 2014: “People would be screaming if a powerful company suddenly decided what information could be seen by what people, when and where. That is the consequence of this decision. A private company now has to decide what is in the public interest.”

Google, currently responsible for almost 90% of web searches in Europe, faces the unenviable task of balancing its duty to comply with its users’ “right to be forgotten” and preserving its reputation as the go-to source for online information and content.

Peter Barron, Google’s director of communications for Europe, said: “The European court of justice ruling was not something that we wanted, but it is now the law in Europe, and we are obliged to comply with that law. We are aiming to deal with it as responsibly as possible... It's a very big process, it's a learning process, we are listening to the feedback and we are working our way through that.”

All applications must verify that the links in question relate specifically to the applicant; where someone applies on another person's behalf, they must prove they have the legal authority to do so.

Is it about privacy or censorship?

The request form launched by Google after the ruling received 12,000 entries from across Europe within 24 hours – at one point receiving up to 20 requests a minute. This grew to 41,000 requests in the first four days.

Feeding into fears about the potential consequences of this ruling, almost a third of the requests related to accusations of fraud, while a further 12% related to child pornography arrests and 20% to other serious or violent crimes.

Of these first 12,000 entries, around 1,500 were said to be people residing in the UK, with an ex-politician, a paedophile and a GP among them.

By December 2014, the number of requests received by Google had grown to around 175,000 from all 28 EU countries, with 65,000 of the links coming from the UK. As of 22 January 2018, Google had complied with 43.3% of requests to remove links.

Before the form was made available, most removal requests to Google were coming from Germany and Spain, with the UK, Italy, and France making up the rest of the top five.

Many are concerned that the ability for users to request the removal of information from search results could see the system abused for nefarious purposes.

However, lawyers have assured those worried that politicians, celebrities, and criminals will probably not benefit from the ruling as Google will have the right to reject applications that request removal of information deemed in the public interest.

It should also be noted that, while links to the objectionable information will be removed, the information will not actually be deleted from the web.

Following on from comments regarding the ruling, Baroness Prashar, chair of the Lords Home Affairs EU Sub-Committee, said: “[We] do not believe that individuals should have the right to have links to accurate and lawfully available information about them removed, simply because they do not like what is said.”