'The Changing Landscape for Search Engines After Google Spain': William Malcolm
Duration: 19 mins 53 secs
About this item
Description: William Malcolm, Senior Privacy Counsel, Google, delivers the second lecture from the "The Changing Landscape for Search Engines After Google Spain" section of the "EU Internet Regulation After Google Spain" conference. This conference was held at the Faculty of Law, University of Cambridge on 27 March 2015, and brought together leading experts on data protection and privacy from around the world. The conference was held with the support of the Centre for European Legal Studies (CELS).
Created: 2015-04-14 18:13
Collection: Google Spain Video backup MOVED
Publisher: University of Cambridge
Copyright: William Malcolm, Mr D.J. Bates
Language: eng (English)
Transcript:
Thank you Mr Debeuckelaere for those remarks, thank you Mr Chair, and thank you to David and Julia and Cambridge for bringing together this great group of people to discuss and debate what I think will continue to be some critical issues over the next few years. The speakers on the first panel expertly and clearly set out the background to the case and all the issues, so, in the interest of brevity, I'm not going to touch on that. I intend to talk about Google's response and to give some insight into my practical experience, and Google's practical experience, in implementing the judgment.
Right from the start, right from this landmark ruling, Google made it clear that, although we didn’t exactly welcome the judgement, we respected it. And it was our job to make it work. I’m very proud of the hard work that our teams have put in over the last ten months to give effect to the individual rights that the Court confirmed in this judgment.
As of 23rd of March, and as is publicly available in our Transparency Report, which I will talk more about in a minute, Google had received delisting requests with respect to 843,000 individual URLs, representing nearly 232,000 individual requests. Roughly speaking, we delist in 41 per cent of cases, and decline to delist in 59 per cent of cases. We publish a full Transparency Report at Google.com/transparencyreport, where you can see these statistics, which are regularly updated, in terms of the volumes of requests we're seeing and what our removal rates look like. There is also a national breakdown, so you can see what those statistics and percentages look like at a country level.
So that's a large volume, and I think it's fair to say that's a lot of work. To individually assess 843,000 URLs takes a bit of doing, and, you know, we moved very quickly to comply. We were very quick to launch a web form, setting up a process to manage these requests. We were very quick to engage with data protection authorities to hear what they had to say on the subject. And we listened to a wide spectrum of views through the advisory council process that we established.
Now let me say that our approach has been largely consistent with the recommendations of European experts and regulators, but there are still areas of disagreement, some of which I will touch on today. Conflict of laws and jurisdictions is never easy, since fundamental rights are at stake, and fundamental rights are weighed differently in different countries and different parts of the world. But we are committed to listening to the debates across Europe as this issue evolves.
So moving on to our experiences of the last ten months. The first thing that we did after we read the judgment was to stand back and say, okay, we need to have a way of receiving these requests. We were conscious that we needed the right amount of data to do the job. We didn't want to create an open channel where individuals would supply more information than was relevant for our purposes in assessing their request. And so, to ensure that we were only collecting the right data from individuals, we decided to launch a web form. We put a great deal of thought into the design of that web form and how it was structured. I will call out the main pieces of information we ask for when someone wants to file a delisting request. We ask for the name used for search, more about that later; we ask for a contact email address; and we ask for an explanation for each URL, again more on that later. We then remove links only from name query searches. That was very clear from the Court's judgment: the ruling was limited to name query searches, so we are not removing links from any and all search result pages, which would be overbroad and is clearly not required by the ruling.
We focused on EU users. Our web form makes it clear that individuals need to select a relevant country. Practically, individuals will need some connection to that country, which will normally but not always mean that they are resident in it. Individuals need to select a country so that we know which law to apply, because there are divergences of practice among national authorities, as I will come on to in a minute, and so that it is clear which DPA any complaint should be remitted to. That's a practical problem, and our solution is the web form.
We focused on EU domains. We currently remove in EU plus EFTA states. We noticed early on that some data protection authorities called for pan-EEA consistency, and we wanted to support that effort. The most logical legal interpretation of this Opinion is actually for national removals, but we thought it was right for Google to take a pan-EU approach to encourage consistency and harmonization for individuals. When we remove a search result related to an individual's name, it will simultaneously be removed from all European versions of Google search. We do not remove on services targeted to non-European countries, including our US service on .com. When individuals search on .com, we already redirect them to the relevant local domain. In practice, the vast, vast majority of our users use these local domains.
We do not think the Court’s ruling is global in reach. It’s an application of European law that applies to search services offered to European consumers. We have a long established way of complying with country-specific laws by removing from the version of our service that targets that country. For example, Google.de in Germany. This is how we have always processed national law removals for national law claims, like hate speech, to use one example, and defamation, to use another. The services on those domains are tailored for users in those countries in a number of ways. It’s not just about legal compliance. They are intended to be the best experience for the user in that country overall.
Another key aspect of Google's implementation of this judgment: we felt very deeply that we needed to be transparent about both the results and the process that Google was running. So we display a generic notice at the bottom of our search results page when a user enters a name query search for most names of a person. And let me be clear that the notice that fires at the bottom of our search results page is not a notice fired with respect to any specific removal. It is a notice fired with respect to most name query searches. And we think it's important to give our users information about the results that they are seeing, and how those results have been compiled.
Also, we think it's important to notify webmasters. This is consistent with our approach in other removals. We are giving webmasters the link or URL that will no longer appear in search results in response to a name query search, not any details of the request. We have long done this in other areas of law, not just for removals made on data protection grounds. We have also let people know on the web form so that they are aware that this will happen. We believe it's important to let third party publishers know when we stop linking to their sites in response to some queries. And we have already started seeing complaints from webmasters about the prospect of removing links to their sites, and we are already facing challenges from publishers about removal decisions that result in reduced traffic to their sites.
We provide this feedback to ensure transparency and address those criticisms directly. We have received communications from webmasters that have caused us to re-evaluate removals and reinstate links, and in some situations third party publishers may want to publish the underlying content. With the right to be forgotten, of course, we as Google, the data controller with respect to search, have a legal obligation to assess each case. However, sometimes, you know, users may get the perception that filling out a form on Google removes the content from the original source. So notifying webmasters may alert the original source to the user's position with respect to the material in a way that actually produces a practical result for the individual. In other cases, webmasters can identify whether an accusation takes traffic away from their site, or was mistaken, or was inaccurate.
Next I want to turn to the issue of what kind of information we have when we make the decision. Clearly, there is this large carve-out for public interest, and we had to decide how to apply that. When we assess a request we have the information from the web form, and we have the material from the site. We do not have any information from the publisher or speaker, and we think it is important, to ensure balance in the process, that we have that input. There is of course no journalistic exemption for search engines; that was made clear by the ruling. But at the moment there is no established way for a publisher or speaker to feed back, or even to be aware, that a particular name query search has been delisted.
We will continue to give careful thought to these issues, but we believe we are taking the right approach. However, we recognise that there is a spectrum of strongly held views on these issues across Europe within the privacy community, and even differing views among European data protection authorities. As we continue to discuss these issues with data protection authorities and others, as we evolve our processes, we will, you know, continue to keep an open door and an open mind as to what comes next. For example, we recently introduced a policy not to send webmaster notifications to certain categories of sites, such as malicious porn sites as I have noted previously.
As most of you know, the criteria laid down by the CJEU were fairly vague. We worked hard to develop criteria to apply to the myriad real-world situations, some of which I am going to talk about, which we faced when dealing with the requests that came before us. It was a broad ruling with little guidance on application. Our challenge was to evolve our approach. We accept that our policies and practices will change over time based on what we hear from data protection authorities and what we hear from courts. In that respect, we welcome the guidance of the Article 29 Working Party. We were comforted by the fact that much of the Working Party's removal criteria were similar to the criteria that we had already developed and were implementing. And actually, that consistency between the approach we were taking and the recommendations of the Working Party was comforting for me and others at Google.
I want to turn a little bit to the guidelines and some of the ways that Google thinks about the issues internally, and some of the trends that we're seeing. We want to be thoughtful and pragmatic about where we decline to delist. A big area is public figures, where we have a general expectation that we will do fewer removals. So I'll give you a couple of examples of cases where we refused to remove: a footballer who wanted to remove a news article about his career highlights, a TV star who wanted us to remove news articles about a recent sex scandal. Someone can also be a figure in the public eye because of what they do in their professional life. We have had removal requests covering a respected scientist who wanted to remove criticism of his scientific work. And, you know, there are challenges here. But even with public figures, when you're looking at these news stories, you have to take into account that they have a public persona and a private persona. Some of the calls are difficult, and we are seeking to develop more nuanced criteria as we move forward.
Another area of contention is news stories. When someone is mentioned as a meaningful part of a news story, again that's a real indicator for us that that might be something that we would decline to delist. If the source is a reputable news outlet, and if we are dealing with a recent article, then, you know, generally we feel that having access to this information is in the public interest.
So there are challenges around that. Another area of challenges is political speech, and to give you some examples of areas where we have pushed back: members of the government requesting the removal of news articles about their corruption scandals; police officers convicted of bribery and corruption, or having disciplinary charges in relation to bribery and corruption levied against them; a member of government requesting the removal of posts by citizens criticising policies. So these are real examples. And there are really, really difficult examples in political speech. We get a lot from people who want to clean up their past at university. They say, I was involved in a political society at university, and, you know, I'm no longer active in public life, and I want to remove or delist all name query search information in relation to the statements I made at that time. In some cases they say that when they are in fact running for political office. And in some cases they say that when, you know, clearly what they are doing is trying to limit the field of information that is available online. So these are challenges, and where we draw the line on these is something that we will continue to evaluate.
I want to move on to some trends, then, and some issues we are seeing. Complaint volumes to data protection authorities, from what we can see at this point, are relatively low in relation to the 840-odd thousand URLs we've received removal requests for. Very low. I put them in the hundreds. I see every one of them personally, and I put them in the hundreds. But let me try and draw out some of the things we are seeing. We have some data protection authorities who are ordering us to remove government records, simply on the basis that the government site is the right place to find that government record, and that there is no public interest in linking from a search engine in response to a name query search. We have some complex cases involving defamation where it is not clear to us or the data protection authority whether the content in question is true or not, but we are nonetheless being ordered to remove. And again I welcome and call out the Article 29 Working Party's guidance on defamation in that respect. As one might expect, the criteria on past crimes, and when it is appropriate to remove a past crime, diverge significantly nationally, even if one has common criteria. There are individual rules across Europe and in various countries with respect to the treatment of past crimes, and so we are seeing differences in standards there in the way that the data protection authorities are approaching the issue.
Recency is an issue. We often get asked, well, how many years for this and how many years for that. And we have to say we judge each individual case on all of its merits. So our approach is much more dynamic than that. We look at a range of factors and we don't draw hard lines, because that would be inappropriate. We also have, as I mentioned before, sensitive issues and political content, and these issues tend to cause difficulties. We have one case at the moment where we are asked to remove a re-reported case, that is, something that had been removed, and then a newspaper has reported on the fact that there was a removal. And we have one request from a data protection authority to remove that re-reported case. So, you know, some trends are starting to emerge for sure.
I would also like to call out the work of our advisory council. We welcome all their advice and guidance, and we are considering carefully how to implement that. I would also like to point out that advisory council members do not adjudicate on individual cases. I think there has been some public misunderstanding about that.
So to close, our response will not be static. We know it will change over time, and we know that data protection authorities will have guidance for us. We plan to learn from experience. We remain committed to engaging in thoughtful collaboration with the Working Party and with individual data protection authorities to discuss these issues further. In parallel, across Europe, national courts are starting to build a body of jurisprudence to interpret and apply the CJEU decision. Over time, collectively, we are gaining experience in processing removals and developing a better understanding of the implications of the judgment. We know that DPAs' views will differ from our own in some cases, just as the DPAs would reach different decisions amongst themselves in some cases. But we will only push a case if there is a public interest in clarifying the position. We know that tough debates lie ahead, such as on the scope of removal and the rights of publishers in the process. We think it is important to have those debates openly and respectfully. Our door is open, we're listening, and we want to work with those in the room and data protection authorities as we move forward.
Available Formats
Format | Quality | Bitrate | Size
---|---|---|---
MPEG-4 Video | 1280x720 | 2.98 Mbits/sec | 445.87 MB
MPEG-4 Video | 640x360 | 1.94 Mbits/sec | 289.36 MB
WebM | 1280x720 | 2.26 Mbits/sec | 338.26 MB
WebM | 640x360 | 596.37 kbits/sec | 86.92 MB
iPod Video | 480x270 | 520.43 kbits/sec | 75.79 MB
MP3 | 44100 Hz | 249.75 kbits/sec | 36.43 MB
MP3 | 44100 Hz | 62.23 kbits/sec | 9.11 MB
Auto | (Allows browser to choose a format it supports) | |