Law, Justice and Journalism

Posts Tagged ‘data protection’

Google Spain and the EU’s data protection Directive

In Law, Media regulation, Research, Uncategorized on May 13, 2014 at 3:35 pm

By Professor Steve Peers

The EU’s data protection Directive was adopted in 1995, when the Internet was in its infancy and most or all Internet household names did not exist. In particular, the first version of the code for Google’s search engine was written the following year, and the company was officially founded in September 1998 – shortly before Member States’ deadline to implement the Directive.

Yet, pending the completion of negotiations for a controversial revision of the Directive proposed by the Commission, this legislation remains applicable to the Internet as it has developed since. Many years of controversy as to whether (and if so, how) the Directive applies to key elements of the Web, such as social networks, search engines and cookies, have culminated today in the CJEU’s judgment in Google Spain, which concerns search engines.

The background to the case, as further explained by Lorna Woods, concerns a Spanish citizen who no longer wanted an old newspaper report on his financial history (concerning social security debts) to be available via Google. Of course, the mere fact that he has brought this legal challenge likely means that the details of his financial history will become known even more widely – much as many thousands of EU law students have memorised the name of Mr. Stauder, who similarly brought a legal challenge with a view to keeping his financial difficulties private, resulting in the first CJEU judgment on the role of human rights in EU law.

 

The Court’s judgment

The CJEU addressed four key issues in its judgment: (a) the material scope of the Directive, ie whether it applies to search engines; (b) the territorial scope of the Directive, ie whether it applies to Google Spain, given that the parent company is based in Silicon Valley; (c) the responsibility of search engine operators; and (d) the concept of the ‘right to be forgotten’, ie the right of an individual to insist (in this case) that his or her history be removed from accessibility via a search engine. The details of the Court’s ruling have been summarised by Lorna Woods, but I will repeat some key points here in order to put the following analysis into context.

 

Material scope

Does the Directive apply to search engines? The CJEU said yes. The information at issue was undoubtedly ‘personal data’, and placing it on a website was ‘processing’. A search engine was processing personal data, even though it originated from third parties, because (using the definition in the Directive) it ‘collects’ data from the Internet, then ‘retrieves’, ‘stores’ and ‘discloses’ it. It was irrelevant that the material had been published elsewhere and not altered by Google, as the CJEU had already ruled in the Satamedia case (in the context of tax information published on CD-ROM). Moreover, the definition of ‘processing’ does not require that the data be altered.

A second – and perhaps more important – point was whether Google was a ‘controller’ of the data, with the result that it has liability for the data processing. Again, the key issue was Google’s use of data already published elsewhere. The Advocate-General had concluded from this that Google was not a data controller – but the CJEU reached the opposite conclusion. On this point, the Court, ruling that there must be a ‘broad definition of the concept’ of a ‘controller’, distinguished between the original publication of the data and its processing by a search engine: Google undoubtedly controlled the latter activity, by means of its control over the search process. One is unavoidably reminded of the Machiavellian search-engine billionaire who frequently appears in episodes of The Good Wife – although of course he is nothing like the executives of Google.

In particular, the Court ruled that the activities of search engines make information available to people who would not have found it on the original web page, and provide a ‘detailed profile of the data subject’, and so have a much greater impact on the right to privacy than the original website publication.

 

Territorial scope

Does the Directive apply to search engine companies based in California, with a subsidiary in Spain? The national court suggested three grounds on which this might be the case: the ‘establishment’ in the territory; the ‘use of equipment’ in the territory (as regards crawlers or robots, the possible storage of data and the use of domain names); or the default application of the EU Charter of Fundamental Rights.

The Court found that Google Spain was ‘established’ in the territory, and therefore the data protection Directive, in the form implemented by Spain, applied. It was not necessary to rule on the other possibilities as regards the scope of the Directive, which are very significant in the context of the Internet, so those issues remain open. It should be noted, however, that in light of the objectives of the Directive, the rules on its scope ‘cannot be interpreted restrictively’, and that it had ‘a particularly broad territorial scope’.

Why was Google Spain established there, even though it did not carry out any search engine activities? The CJEU said that it was sufficient that the company carried out advertising activities, these being linked to the well-known business model of Google (selling advertising which was relevant to search engine results).

 

Responsibility of search engine operators

The CJEU ruled that search engine operators are responsible, distinct from the original web page publishers, for removing information on data subjects from search engine results, even where the publication on the original pages might be lawful. It confirmed that the right to demand rectification, erasure or blocking of data did not apply only where the data was incomplete or inaccurate, but also where the processing was unlawful for any other reason, including non-compliance with any other ground in the Directive relating to data quality or criteria for data processing, or in the context of the right to object to data processing on ‘compelling legitimate grounds’.

This meant that data subjects could request that search engines delete personal data from their search results, and complain to the courts or data protection supervisory authorities if they refused.  As for Article 7(f) of the Directive, which provides that one ground for processing data (where there was no contract, legal obligation, public interest requirement or consent by the data subject) was the ‘legitimate interests of the controller’, this was a case where (as Article 7(f) provides) those interests were ‘overridden’ by the rights of the data subject.

There has to be a balancing of rights in such cases – including the public right to freedom of expression – but in light of the ease of obtaining information on data subjects, and the ‘ubiquitous’ nature of the ‘detailed profile’ that results from search engine results, the huge impact on the right to privacy ‘cannot be justified by merely the economic interest’ of the search engine operator. The public interest in the information was only relevant where the data subject played a role in public life.

In light of the greater impact of search engine results on the right to privacy, search engines are not only subject to a separate application of the balancing test, but a more stringent application of that test – meaning that the information might remain available on the original website, even if it was blocked from the search engine results. The CJEU states that search engines cannot rely on the ‘journalistic’ exception from the Directive.

 

The ‘right to be forgotten’

Finally, the CJEU accepts the argument that the Directive’s requirements that personal data be retained for limited periods, and only for as long as it is relevant, amount to a form of ‘right to be forgotten’ (although the Court does not say that such a right exists as such). While it leaves it to the national court to apply such a right to the facts of this case, the Court clearly guides the national court to the conclusion that the data subject’s rights have been violated.

 

Comments

The essential problem with this judgment is that the CJEU concerns itself so much with enforcing the right to privacy that it forgets that other rights are also applicable.

As regards the right to privacy, the Court’s analysis is convincing. Of course, information on a named person’s financial affairs is ‘personal data’, and it has long been established that prior publication is irrelevant in this regard – a particularly important point for search engines. Equally, the Court had previously ruled (convincingly) in the Lindqvist judgment that placing data online is a form of ‘data processing’.

While it is less obvious that Google is a ‘data controller’, given that it does not control the original publication of the data, the Court’s conclusion that search engines are data controllers is ultimately convincing, given the additional processing that results from the use of a search engine, along with the enormous added value that a search engine brings for anyone who seeks to find that data. In this sense, Google is a victim of its own success.

Similarly, as regards the territorial scope of the Directive, it would be remarkable if Google, having established a subsidiary and domain name in Spain and sought to sell advertising there, would not be regarded as being ‘established’ in that country. The sale of advertising in connection with free searches is, of course, the key element of Google’s business model (leaving aside the many other companies, such as YouTube and Blogger, that Google has acquired over the years), and making money is surely one of the ‘activities’ of any business that aims to make profits.

The separate liability of Google as a ‘data controller’ obviously justifies the Court’s conclusion that it might, in appropriate cases, be required to take down material from its search engine results that infringes the data protection Directive. This is most obviously relevant where that data is inaccurate or libellous, but that is not the case here, where the personal data is simply embarrassing.

So, in the absence of another legitimate ground for processing (which will normally be the case as regards search engines), the case ultimately turns on the balancing of interests between the data subject, the search engine and other Internet users. And here is where the Court’s reasoning goes awry.

In its previous judgment in ASNEF, the Court ruled that Spanish law failed to apply the correct balance between data subjects and direct marketing companies, because by banning any use of personal data which was not already public, it implicitly did not give enough weight to the company’s right to carry on a business. But here the Court makes no reference to that right, even though Google’s methods are as central to its business model as the use of private personal data is for direct marketers. Indeed, Google’s highly targeted advertising (not as such an issue in this case) is itself obviously a form of direct marketing.

Also in ASNEF, the Court criticised the Spanish law for its automaticity, because it failed to weigh up the interests of companies and data subjects in individual cases. But in Google Spain, it is the Court which sets out an automatic test: the economic interest of the search engine is overridden if the individual is not a public figure.

The interests of other Internet users are only briefly mentioned, even though Article 7(f) requires a balancing of interests not only between the data controller (ie, the search engine in this case) and the data subject, but also as regards third parties to whom the data are disclosed, ie the general public. Oddly, the Court does not expressly refer to the Charter right to freedom of expression (Article 11 of the Charter), and does not expressly link its statements about the balancing test to the case law of the European Court of Human Rights on the best way to balance privacy and freedom of expression.

Furthermore, unlike in ASNEF, the Court makes no mention of Article 52 of the Charter (the provision dealing with limitation of Charter rights, including in the interest of protecting other rights, which also requires consistent interpretation with the ECHR). It should also be noted that, in deciding the key freedom of expression issue itself, the Court has departed from its prior approach (in Satamedia and Lindqvist, for instance) of leaving it to the national courts to decide on this issue.

The Court’s dismissal of the journalistic exception also contradicts its willingness to agree, in Satamedia, that merely sending personal tax data by text message to nosy neighbours could constitute ‘journalism’. Here, of course, it is not Google which is the journalist; but Google is a crucial intermediary for journalists. If journalism can consist of sending out tax information by text message, it could also equally consist of commenting (for whatever reason, and in whatever forum) on an individual’s past financial problems. And there is no reason why the passage of time should count against the exercise of the right of freedom of expression – although that factor should be relevant, as the Court says, as regards the right to privacy.

 

Consequences of the judgment

Obviously, today’s judgment only concerns search engines, but it may have broader relevance than that.  Its relevance to social networks will soon be considered in another post on this blog. For search engines, those which are less successful than Google might not have an ‘establishment’ within the meaning of this judgment, which raises the question of whether they would otherwise have an establishment, use equipment on the territory, or can be covered due to the Charter.

More broadly, any non-EU company with a subsidiary selling advertising in an EU Member State in connection with its Internet services must obviously be regarded as covered by the data protection Directive by analogy with this judgment, without prejudice to those broader possibilities.

As for those search engines which do fall within the scope of the judgment, most obviously Google, it seems that their legal obligations are considerably greater than they had thought them to be. They must respond to individual complaints that the personal data which can be found about that individual is simply too old to be relevant any more, whether it is accurate or not, and if they do not comply they can be challenged before the courts or a supervisory authority.

Could a supervisory authority act of its own motion to enforce this judgment? Probably not, because the rights at issue in this case are triggered by individual complaints. Some people assiduously search Google to see what results they can find on themselves; in this context, I should point out that I am not the same ‘Steve Peers’ from Essex who has been convicted for non-payment of council tax. But others are unaware of, or don’t care about, or couldn’t be bothered to challenge, or are positively thrilled about, the existence of old information about them which can be found by means of using Google.

So not everyone who might conceivably be embarrassed by such old information will complain to Google, but a considerable number are likely to do so. Google’s liability extends to responding to such individuals, but not to completely changing the way it processes personal data in the absence of such complaints.

Interesting questions may arise, however, as regards the interpretation of the rules set out in the judgment: what exactly is a public figure, and how long has to pass before personal data is no longer relevant? For instance, a job applicant can certainly object to Google if its search results include pictures of her dancing drunkenly on a table in 1998. But she could hardly argue that a record of last night’s debauchery must be ‘forgotten’ already – even if she cannot remember it herself.

Such disputes may well prove an opportunity to argue that the remit of this judgment is narrower than it first appears, or even to request (which any national court can do) that the Court reverse at least some aspects of its judgment. For now, however, the CJEU has established a potentially far-reaching right to be forgotten, with possible significant impacts at least on the activity of search engines. While in the Lindqvist judgment, the Court was keen to ensure that the data protection Directive was adapted to the reality of the Internet, in Google Spain it seems to demand that the Internet should rather be adapted to the Directive.

As for the initiative to amend the Directive (to be replaced by a general data protection Regulation), this judgment might speed that process up, since Internet companies now have an incentive to use the process as an opportunity to limit their liability compared to what it would otherwise be – rather than (before the judgment) an interest in slowing the process down, in order to avoid an increase in that liability. Time will tell what the result of that negotiation will be.

Barnard & Peers: chapter 9

 

Google Spain: ECJ has straightjacketed the librarian

In Law, Media regulation, Research, Uncategorized on May 13, 2014 at 3:17 pm

By Peter Noorlander

In 1997, the US Supreme Court in Reno v. American Civil Liberties Union described the Internet as a dramatic new marketplace of ideas, referring to it as a “vast library” of millions of publications and “the most participatory form of mass speech yet developed”. It made everyone a publisher – “[a]ny person or organization with a computer connected to the Internet can ‘publish’ information”.

It followed that, under the First Amendment, the Internet was entitled to maximum protection and the Court cautioned against imposing regulations that would unduly restrict speech (Reno v. ACLU concerned the restriction of access to sexually explicit content, which the US government argued was necessary so as not to discourage parents from allowing their children access to the web. The Supreme Court strongly disagreed, stating that “[t]he interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit of censorship”).

The estimated number of websites in 1997 was just over a million. Yahoo! and Amazon were in their infancy, but Google, YouTube, LinkedIn, Facebook, Wikipedia, Paypal and most other names that are now synonymous with the web had not yet been launched (and Alibaba, about to be launched in potentially the largest IPO seen in the tech world, was the merest glint in its founder Jack Ma’s eyes).

Fast forward to 2014. The world wide web literally crawls with information, some of which is of a private nature. The “vast library” referred to by the US Supreme Court is now of a size that is beyond human comprehension. It is estimated that there are a billion websites in the world today, and more are added every day.

It is impossible for anyone to make sense of or find their way among all this information without the help of a search engine. Search engines are crucial in making the promise of the internet a reality: they mean that an individual blogger or journalist who publishes through a blog has a decent chance of their information being picked up (particularly if they are writing on a subject that few others write about).

Today, the ECJ has identified the very function of the search engine as a serious threat to personal privacy and data protection. The ECJ’s decision in the Google Spain case argues that it is one thing for personal information to be out there on the internet, but that indexing all this information and presenting it in coherent form potentially infringes the right to privacy. The Court has held that “as a rule” the interests of an individual in protecting privacy will outweigh the interest of the public in receiving that information, even when lawfully published, and so the individual may ask Google (or another search engine) to remove that information from its search results.

In doing so, the ECJ has taken up a position that is almost diametrically opposed to that of the US Supreme Court. Where the US Supreme Court examined the situation from the vantage point of protecting free speech and society’s interest in the free flow of information, the European Court of Justice has come at this from the angle that personal rights to data protection and privacy are paramount. While it acknowledges that there may be a societal interest of access to information in some cases, in particular where the information concerns the role of a “data subject” in public life, it holds that as a rule, the privacy interest overrides societal interests of access to information – even lawfully and legitimately published information.

The ECJ refers only to Articles 7 and 8 of the EU Charter of Fundamental Rights – the rights to privacy and data protection. There is no mention of Article 11, the right to freedom of expression and to receive information. There is an acknowledgement of the importance of journalism, but no apparent awareness of the importance – in the internet age – of the right to freedom of expression and how that right is realised.

The implications of the judgment are massive. Google has been put on the spot, but other search engines are affected too. Anyone who feels that information is no longer ‘relevant’ to their current situation – be it an old conviction for shoplifting, the beating of a spouse, or a conviction for corruption – will be in a strong position to approach Google and request that the page listing that information be de-indexed.

The Supreme Court referred to the World Wide Web as “comparable, from the readers’ viewpoint, to both a vast library including millions of readily available and indexed publications and a sprawling mall offering goods and services.”

This remains unchanged – the vast library is still there. And it’s growing. But the indexing is under serious threat – the European Court of Justice has straightjacketed the librarian.

The real loser in this case: the public and the interest in society at large in the free flow of information and ideas.

Google Spain: Freedom of Expression and the Right to be Forgotten

In Law, Media regulation, Research, Uncategorized on May 13, 2014 at 11:49 am

By Professor Lorna Woods

Case C-131/12 Google Spain SL, Google Inc. v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, judgment 13th May 2014

The ECJ today handed down judgment in a landmark case regarding data protection and the Internet. The EU Data Protection Directive (DPD) establishes a system which controls the way in which data processing in the EU is carried out, giving data subjects certain rights to correct data and to object to its processing. As well as raising issues as to the meaning of these rights, this case raises important questions about the scope of the directive – in terms of geography (its impact on non-EU-resident processors) and in terms of the activities caught by it (what is processing, and who is a ‘controller’). It has repercussions in terms of the applicability of EU law even to non-EU-based companies and the rights that individuals have against those processing their data – and it is now clear that this means not just the first publishers of material but also those who republish it – including search engines.

The facts of the case are simple. A newspaper published a notice of auction in respect of the property of Mario Costeja González for unpaid debts. He subsequently paid the debts and the property was not auctioned. Some ten years later, Google searches on his name still brought up the newspaper notice. The claim that the newspaper archive should be amended had been rejected, and so he brought an action to require Google to suppress these results. While the AEPD agreed, Google appealed to the Audiencia Nacional (National High Court), which referred questions on the meaning of the DPD to the ECJ.

The ECJ first considered whether Google’s activities fell within the scope of the DPD. Google had argued that search engines do not distinguish between data protected by the DPD (personal data) and other data, and that furthermore it had no control over the data or the selection of the data. It therefore argued that it did not ‘determine […] the purposes and means of the processing of personal data’ as required by the terms of the DPD. The ECJ rejected these arguments. Firstly, it was not contested that the data included ‘personal data’ which was processed: the fact that there was non-personal data in the search engine operations too seems irrelevant, as is the fact that the search engine results are of material that has already been published and is unaltered by the search engine, even where that publication was by the media. So, ‘a search engine ‘collects’ such data which it subsequently ‘retrieves’, ‘records’ and ‘organises’ within the framework of its indexing programmes, ‘stores’ on its servers and, as the case may be, ‘discloses’ and ‘makes available’ to its users in the form of lists of search results’ (paras 28-29). The ECJ emphasised that a broad definition must be given to ‘controller’ to ensure complete protection for data subjects. In assessing this, the ECJ looked not only to what Google does in terms of the organisation of the search engine but also to the impact of the search engine in linking individuals to results. The ECJ concluded that:

as the activity of a search engine is therefore liable to affect significantly, and additionally compared with that of the publishers of websites, the fundamental rights to privacy and to the protection of personal data, the operator of the search engine as the person determining the purposes and means of that activity must ensure, within the framework of its responsibilities, powers and capabilities, that the activity meets the requirements of Directive 95/46 in order that the guarantees laid down by the directive may have full effect and that effective and complete protection of data subjects, in particular of their right to privacy, may actually be achieved (para 38).

As regards territorial application, two possibilities arose. The first was that Google’s activities fell within Article 4(1)(a), which applies the DPD to those with an ‘establishment’ in a Member State. The second relates to Article 4(1)(c), which concerns the ‘use of equipment situated on the territory of the said Member State’. The Advocate General focussed on the first aspect, suggesting that although the processing is carried out by Google in the USA, Google still had an establishment in Spain which, although not itself processing data, constituted an important element in Google’s business model by selling advertising, thereby satisfying the requirement that the processing be ‘carried out in the context of the activities’ of the EU establishment. The ECJ seems broadly to have agreed, again emphasising the purpose of the DPD and the need to protect privacy (paras 53-54). The ECJ concluded:

the activities of the operator of the search engine and those of its establishment situated in the Member State concerned are inextricably linked since the activities relating to the advertising space constitute the means of rendering the search engine at issue economically profitable and that engine is, at the same time, the means enabling those activities to be performed (para 56).

The ECJ then turned to whether Mr Costeja González had the right to ask for the information to be removed from the search results. The ECJ noted that – subject to limited exceptions – all data processing must comply with data quality principles found in Article 6 DPD; and with one of the legitimacy criteria in Article 7 DPD. Article 7 permits the processing of personal data where it is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where those interests are overridden by the interests or fundamental rights and freedoms of the data subject. By reasoning thus, the ECJ made the exercise one of balance, but bearing in mind the status of the individual’s rights as fundamental rights (para 74). The ECJ once again highlighted the reach and impact of the Internet and of search engines in structuring information, to hold that:

‘[i]n the light of the potential seriousness of that interference, it is clear that it cannot be justified by merely the economic interest which the operator of such an engine has in that processing. However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, in situations such as that at issue in the main proceedings a fair balance should be sought in particular between that interest and the data subject’s fundamental rights under Articles 7 and 8 of the Charter. Whilst it is true that the data subject’s rights protected by those articles also override, as a general rule, that interest of internet users, that balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life’ (para 81).

The ECJ added that given the structuring function of a search engine, it may have more impact than first publication on a web page. On this basis, a search engine operator may be required to remove information. The ECJ took a similar approach with regard to the rights set out in Articles 12(b) and 14(a) DPD, which concerned the right to be forgotten. It held that:

‘having regard to the sensitivity for the data subject’s private life of the information … and to the fact that its initial publication had taken place 16 years earlier, the data subject establishes a right that that information should no longer be linked to his name by means of such a list’(para 98).

This finding is, however, subject to the public interest in having access to the information. While it seems unlikely that there is any such interest in this case, that is a matter for the referring court to determine.

In sum, this is a resounding win for Mr Costeja González and the Spanish data protection authorities.

Google Spain: Blog Roundtable and Panel Debate

In Events, Law, Uncategorized on May 12, 2014 at 2:09 pm

On 13 May, the Grand Chamber of the European Court of Justice is due to hand down its long-awaited judgment in Google Spain. This case raises several interesting human rights issues, including the concept of a ‘right to be forgotten’. Over the next few days we will post reactions to the judgment by different commentators. Today’s first post, by Lorna Woods, presents the background to the case and highlights a number of the key issues.

On 20 May, the Centre for Law, Justice and Journalism (CLJJ) and the Human Rights Centre at Essex University will hold a panel debate at City University London on the implications of the judgment.

Google Spain – Background

In Law, Uncategorized on May 12, 2014 at 12:43 pm

By Professor Lorna Woods

Background

The case before the ECJ arises from a reference from a Spanish appeal court, the Audiencia Nacional, and concerns the applicant’s claim that Google should remove certain information from its search results. In 1998, a Spanish newspaper had – at the direction of the relevant Spanish court – advertised that property of the applicant was to be auctioned in relation to the applicant’s social security debts. The applicant paid the debts and the foreclosure did not take place. Yet, approximately a decade on, a search on the applicant’s name revealed the newspaper notice of the foreclosure. The applicant applied to the Spanish Data Protection Authority (AEPD), asking for an injunction against both the newspaper and the search engine. The AEPD dismissed the claim against the newspaper (which was under a legal obligation to publish the official notice), but issued an injunction against Google Spain SL and Google Inc. to delete the data from the search engine’s index. Google appealed to the Audiencia Nacional which referred 9 questions on the interpretation of the Data Protection Directive (DPD) and the Charter of Fundamental Rights to the ECJ. Note also that the e-commerce Directive, which normally provides safe harbours for internet intermediaries, excludes data protection issues from its scope.

This case is not an isolated incident: reports suggest that there are some 130 similar cases in Spain, and the problem of the longevity and prominence of old, and sometimes inaccurate or misleading, information does not affect only Spanish citizens. The questions referred, while relating to the interpretation of the DPD, raise fundamental questions about the nature of the internet, who makes decisions about content, and on what basis.

The hearing took place on 26 February 2013 and the Opinion of the Advocate General was released on 25 June 2013.

Questions Referred

The questions of legal interpretation referred can be divided broadly into three categories:

  • Jurisdiction
  • Nature of Google’s activities
  • Obligations under the DPD, namely the existence of the ‘right to be forgotten’.

As regards these questions, the Opinion of the Advocate General can be summarised as follows:

  • Yes, the DPD and the Spanish implementing rules can apply to Google in these circumstances;
  • Yes, Google processes data, but no, it is not a data controller; and
  • Even if the DPD applies to Google’s activities, there is no ‘right to be forgotten’.

Jurisdiction

Normally national laws (here the Spanish rules implementing the DPD in Spain) do not apply outside the territory of the state enacting those rules. So the question here is whether Spanish laws apply to Google. The analysis here was complicated by the fact that Google’s search engine is operated solely by Californian-based Google Inc. The only presence Google has in Spain is a subsidiary, Google Spain SL, the activities of which are limited to promoting and selling advertising space on the search engine.
The relevant provision is Article 4 DPD, specifically Article 4(1)(a), which provides a Member State is to regulate when:

The processing [of data] is carried out in the context of the activities of an establishment of the controller of the data on the territory of the Member State…

and Article 4(1)(c), where a Member State should regulate when:

the controller is not established on Community territory and … makes use of equipment, automated or otherwise, situated on the territory of the said Member State… .

The referring court asked whether Article 4 is satisfied by the fact that Google uses its crawlers to gather information from websites hosted in Spain, or by the fact that it uses a Spanish country-code TLD (google.es) and directs Spanish users to searches and results relevant to them in terms of language or location. It also asked whether the “use of equipment” triggering Article 4(1)(c) can be inferred from the fact that Google stores the indexed information on servers whose location is undisclosed, and whether the Charter could extend jurisdiction beyond the circumstances expressly envisaged in the DPD.

Google’s Activities

The DPD imposes obligations on ‘data controllers’ who process personal data, so the question is whether Google’s activities fall within the definitions found in Articles 2(b) and 2(d) DPD. Article 2(b) defines processing as:

Any operation or set of operations which is performed on personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure, or destruction.

Under Article 2(d), a controller is the person or body which

‘determines the purposes and means of the processing of personal data’.

Assuming that Google is a data controller processing personal data, the referring court asked whether the AEPD could require Google to remove the results without also requiring the newspaper to do so.

Right to Be Forgotten

The final set of questions relates to the scope of the rights of data subjects. The DPD provides individuals with rights to obtain the “rectification, erasure or blocking of personal data” which are “incomplete or inaccurate”. The questions referred ask specifically whether the rights to the erasure or blocking of data (Art. 12(b) DPD) and to object to the processing of data (Art. 14(a) DPD) allow the data subject to prevent search engines from indexing personal information not because it is illegal, but purely because it is damaging (and outdated).

Opinion of the Advocate General

The Advocate General suggested that where a parent search engine based outside the EU is a data controller, its EU subsidiary must also be considered to be a data controller on the basis that the EU subsidiary would be acting as a ‘bridge’ for the search engine function to that member state’s advertising market. The Advocate General stated that ‘an economic operator must be considered a single unit…[and] not be dissected on the basis of its individual activities related to processing of personal data’. This means in principle the Spanish laws can apply. This has potentially significant repercussions for internet intermediaries with a similar business model. It raises some interesting questions about the country of origin principle in the DPD and the extra-territorial reach of the DPD. Certainly, it is inconsistent with the approach taken in some national courts relating to Facebook (though the national courts have not themselves taken a consistent line). This issue would be dealt with by the proposed Data Protection Regulation, Article 3.

Nonetheless, on the facts, the Advocate General concluded that while Google was processing data, it was not a data controller, or was so only partially. Before coming to this conclusion, the Advocate General made some remarks about the potential scope of application of the DPD, suggesting that it was too broad. He suggested that, in applying the DPD, the ECJ should

“apply a rule of reason … the principle of proportionality, in interpreting the scope of the Directive in order to avoid unreasonable and excessive legal consequences”.

This novelty in the interpretation of the DPD seems to run against the views of those arguing for a review of the data protection regime on the basis that it is not stringent enough. According to the Advocate General, an entity must be aware of a defined category of data, and process it with some degree of intent, in order to be a controller. The Advocate General argued that Google is not ‘aware’ of the actual personal data on those third-party websites, nor is it intending to process that personal data in any ‘semantically relevant way’. The obligations in the DPD do not apply directly to it, but would bite only if search engines processed personal data in a manner inconsistent with the instructions or requests of publishers. Google is, however, processing data as regards the index of the search engine. The Advocate General concluded that, in relation to this data, Google had satisfied its duties under the DPD.

The Advocate General also considered the general balancing of rights – in particular the balance between the right to privacy/right to reputation and freedom of expression – and argued against a right to be forgotten. The Advocate General stated that a right to be forgotten ‘would entail sacrificing pivotal rights such as freedom of expression and information’. According to the Advocate General, the right to erasure and blocking under Article 12(b) applies to incomplete or inaccurate data, and the right to object under Article 14(a) arises only where there are compelling legitimate grounds. A data subject’s wish to restrict the dissemination of true and accurate public information, on the grounds that it is harmful or contrary to his interests, does not satisfy this requirement. There is no general right to be forgotten. Again, this issue would have been dealt with by the proposed Data Protection Regulation, though it should be noted that the proposed right was controversial and has been watered down from the original proposal. Nonetheless, the national courts in some Member States (such as France) seem to recognise some form of right to be forgotten.

The Opinion of the Advocate General is very much a policy-driven opinion, and one that is hard to square with the wording and stated purpose of the DPD. His approach raises interesting questions about all three issues, as well as about the balance between freedom of expression and the right to a private life/reputation, which will have more general repercussions.

General Issues

 

  • Which laws apply to transnational digital service providers? Will EU rules become global standards?
  • What is the balance between freedom of expression and other rights and interests – and whose speech rights are we concerned with?
  • Who is responsible for moderating content: publisher or re-publisher/location tool?

Further Links

 

Peter Fleischer, Google’s Privacy Counsel has some personal comments on the right to be forgotten, while over at SafeGov, Richard Falkenrath also has some further thoughts on the right to be forgotten.

Open justice in the digital era and data protection

In Events, Journalism, Justice, Research on June 5, 2013 at 10:21 am

By Judith Townend

I had the chance to discuss the Centre’s ‘Open Justice in the Digital Era’ project yesterday, at the first joint seminar of the DP Forum and NADPO (The National Association of Data Protection Officers).

The theme was ‘The challenges of complying with evolving standards’, and the other speakers included: Martin Hoskins, data protection consultant; Judith Jones, Group Manager, Government & Society, Information Commissioner’s Office; Robert Bond, Head of Data Protection and Information Security at Speechly Bircham; and Lynne Wyeth, Head of Information Governance, Leicester City Council.

It provided a fascinating insight into the regulatory and legal challenges ahead (especially in view of the EC’s draft General Data Protection Regulation*), both in terms of the theoretical framework and practical issues on the ground for DP officers (whose number is set to increase, if EC proposals go ahead).

I introduced the Centre for Law, Justice and Journalism’s ‘Open Justice in the Digital Era’ project and the privacy-related issues we have stumbled upon, in discussing potential recommendations for more efficient and systematic digitisation of courts information. There are important issues to consider around Data Protection, Rehabilitation of Offenders and the ‘Right to be Forgotten’, a concept included in the draft Regulation.

A quick summary can be found on my Meeja Law blog.

*A vote on the lead rapporteur’s report regarding amendments to the Proposed Regulation, scheduled for 29 May, has been postponed as a result of the high number of amendments to consider.

Lorna Woods: Leveson, the ICO and Data Protection

In Journalism, Law, Media regulation, Research, Resources on March 25, 2013 at 8:57 am

By Professor Lorna Woods

One aspect of the Leveson recommendations that seems to have escaped the headlines is that relating to data protection, though implementation of his recommendations could give those adversely affected by media treatment of their personal data some tools.

Section 32 Data Protection Act provides an exception to data processing rules in relation to a number of ‘special purposes’, which include media purposes. The scope of the exemption is pretty broad: it provides an exemption from compliance with any of the Data Protection Principles except the Seventh Principle (security), and from the rights of access and objection (s. 32(2) Data Protection Act).

This exemption is available provided the press-related data controller believes that the special importance of the public interest in freedom of expression is served by the processing of personal data, and that the processing of such data is with a view to publication.

The terms of the Act in this regard are thus vague and potentially subjective; they do not really give any clear steer on when processing of data might be protected.  Section 32(3) specifically provides, however, that when considering whether such belief was reasonable, “regard may be had to [a data controller’s] compliance with any code of practice” and provides that such codes may be designated by statutory instrument.

While there are existing codes for journalists (not limited to the PCC Code (SI 2000/1864), but including those put together by other media organisations, e.g. the BBC), they do not provide sufficiently detailed guidance on data protection obligations. Section 51 Data Protection Act empowers the ICO to draw up codes of good practice, or to encourage trade associations to do so. On this basis the ICO consulted (closing date 15 March) on its intention to produce a code of conduct aimed at media organisations, including but not limited to the press, as it proposed in its response to the Leveson Report.

So given that there are existing codes under the system, what is the big deal about a new code?  Well, if it is designated under s.32(3), then this brings into play the (statutory) enforcement procedures under the Data Protection Act.  Given the monetary penalties that the ICO can now apply, this might get some attention.

More generally, the ICO has committed itself –again in response to Leveson – to “provid[ing] regular updates to Parliament on the effectiveness of the measures we are adopting in response to Lord Justice Leveson’s recommendations and more generally on our assessment of the culture, practices and ethics of the press in relation to the processing of personal data”. This may give evidence about whether any new system of regulation is working which, crucially, comes from outside the system.

This then re-emphasises the importance of the scope of the journalistic exception and the meaning of ‘public interest in freedom of expression’, which is presumably tied in to the fourth estate capacity of the media, rather than its capacity for spreading rumour and gossip.

Further, how closely connected must the processing of the data be to the publication of a story to benefit from the exception? Mr Jay made this point at the Leveson Inquiry: when the press obtains an ex-directory number (for hacking purposes), is it likely that the press would publish the ex-directory number? The answer is “no”, so presumably processing such material cannot benefit from section 32.


Lorna Woods: Google and Data Protection – Again!

In Comment, Law on March 16, 2012 at 9:52 am

By Professor Lorna Woods

A new reference has landed on the ECJ’s desk: Google Spain and Google (Case C-131/12) from the Audiencia Nacional in Spain.

The ECJ’s official website is a bit thin on details, but this seems to be the same case reported by Reuters. That case concerns the right to be forgotten – implicit in the current data protection regime (but which would be made explicit were the draft Data Protection Regulation ever to come into being).

While the judge apparently referred a number of questions, including one about jurisdiction, the central issue is whether Google should be obliged to delete data referring to individuals.  The impetus for the cases comes from aggrieved individuals who have applied to the Spanish data protection authority to have information deleted.  This case is likely to be one that is closely watched given the likely stormy passage of the proposed Data Protection Regulation.

Central to the discussion will be the relationship between the e-Commerce Directive (Directive 2000/31/EC) and the Data Protection Directive. While the e-Commerce Directive shields ISPs from liability in a range of circumstances, that directive is expressed not to apply to ‘questions relating to information society services covered by Directives 95/46/EC and 97/66/EC’ (article 1(5)(b) e-Commerce Directive and Recital 14).  Directive 95/46/EC is, of course, the current Data Protection Directive.

Google is, of course, not unfamiliar with the exception to the e-Commerce Directive, as it arose when directors of Google were charged under Italian data protection laws in relation to user-generated content (UGC) posted on a YouTube-type service operated by Google. The UGC was a mobile-phone clip which showed some boys bullying another boy with Down’s syndrome. The Google executives were given six-month suspended gaol sentences. A decision on appeal was due to be handed down by the Court of Appeal in Milan in 2011, though in September, according to one of those involved (Peter Fleischer), the case had not been assigned. One would hope, in the interests of timely justice, that this issue is decided before the ECJ hands down its ruling in Case C-131/12.

Given the non-availability of the hosting exceptions, the key issue is presumably the scope of the rights under the DPD. Therein lies the rub. While the DPD is set against a privacy (Article 8 ECHR) backdrop, it does not grant any particular right to be forgotten. Instead, the DPD provides how data should be managed, which includes archiving and deletion obligations. How far the ECJ is prepared to push this, especially in the light of data protection as a fully fledged right in the EU Charter, remains to be seen. This is the new contentious issue in the privacy/freedom of expression debate. For a range of views see: Google’s privacy counsel; a security consultant; and an academic viewpoint [PDF], among, no doubt, many more.
