There are times when I almost (but not quite) feel sorry for Google, because whatever it does, it's going to be wrong. Let's quickly go over the basics again before we start. Google is not a search engine, it's an advertising company using search to make money. Google doesn't care about the results that it provides because it's using its algorithms to work out why one result is better than another. It relies on the world to tell it what's good or bad, and it acts merely as a mirror or a reflection of society. Google is not there to give you the 'right' result, since there isn't a right result. In a sense, that's quite admirable, because it's leaving the user to make up their own mind as to the relevance or authority of a site, and it's not making that call for them. That's all good and wonderful as long as the user is capable of making that judgement call, and is savvy enough to realise that what they see is not necessarily true, accurate and correct.
So if I ask Google 'did the flimflam band exist?' I'm happy enough for Google to return a result that says 'No it didn't, and this is why'. However, when we get to something like the Holocaust, it's a rather different matter, especially when we see this:
Similarly, if you do a search for 'Martin Luther King' you'll get a racist site, and you may recall that recently a search for 'three black teenagers' returned sites that tended to imply that such youths were criminals. As reported on Gizmodo, Google says it is “saddened to see that hate organizations still exist,” but it does “not remove content from [its] search results, except in very limited cases such as illegal content, malware and violations of our webmaster guidelines.” The Google spokesperson made sure to add, “The fact that hate sites appear in Search results does not mean that Google endorses these views.” This is disingenuous at best, or if your attitude towards Google is like mine, it's a straight out lie. It's a lie because, for starters, Google quite happily removes material when asked to by governments - we see this under the 'right to be forgotten' ruling. Furthermore, Google could quite easily remove, and DOES remove, content from search results when it finds 'Google bombs'. These used to exist a lot in the past - do a search for 'liar' and the first result would be a biography of George Bush or Tony Blair, for example. Google realised that it was being 'gamed' and changed the system so that it couldn't happen again.
Google has chosen not to do that this time. It's perfectly obvious that Google is being gamed, with links to that specific result coming from Stormfront itself, Wikipedia and other right wing/racist websites. As with the MLK example, people link to it, Google decides that's an important indicator of value, and moves the site further up the ranking. In and of itself it's a perfectly acceptable way to help rank results, but as I say, it's open to being abused. I know it's happening, so do you, and so does Google, but Google has chosen to do nothing. I understand why Google doesn't want to remove material - I think the Stormfront site is vile, but that's just my opinion and it shouldn't hold any more weight than that of an extreme right wing racist. Google isn't a global police force, nor should it act like one.
However, it has a level of responsibility. It's responsible to the people who use its service - just because we don't pay for it doesn't mean Google can skip off into the sunset whistling a happy tune. It's responsible to its advertisers and to its shareholders to provide a service that does the job people expect it to - giving them the information that they want. If I want to see racist websites that deny the Holocaust, then Google shouldn't stop me seeing them. However, and this is the thing - there's a difference between that, and giving a site a coveted #1 spot in the results because the algorithm is being played. This is neither moral nor ethical, it's a cop out. Google is prepared to intervene to remove instant suggestions that imply that Jews are bad people, for example, so its holier-than-thou 'we don't fiddle with our results' is, as I've said, a blatant lie.
Let's look at other engines - Bing has the Stormfront page on its first page of results, but interestingly the summary that it uses actually contradicts the Stormfront title. DuckDuckGo has a completely different set of results, all of which start with discussions about Holocaust denial, with Stormfront way down the page. Yahoo (who are they again?) doesn't have it on its first page of results. Yandex does follow the Google line, with the first two results denying that it took place. So it's not a clear cut answer, but there are plenty of ways of ensuring that the result which is angering people isn't given such a prominent place. So is there any way around all of this?
A solution is obvious - in a small number of cases, such as this, Google could intervene and remove sites in exactly the same way that it has done with the right to be forgotten results, but with one difference. Google already has a mechanism where it shows a small number of results, notes that the rest are very similar, and asks whether the searcher wants the search re-run with all the results included. At the bottom of the page Google could, quite easily, have a link that says 'Some results have been removed, but you can view them if you wish'. That way the casual searcher doesn't see racist material on their screen, Google hasn't deleted anything, and it is still leaving it up to the end user to decide what they want to do with the results they are getting.
Google is making a choice not to do this, and instead has chosen to leave racist results at the top of the screen. That's their call. Yours is to decide if you want to continue to use it or not.
This seems to me to make as much sense as suggesting that dictionaries should only contain nice words, or that the publishers of news should hide nasty stories, or indeed, as the right to be forgotten does. As you say, "Google is not there to give you the 'right' result, since there isn't a right result" - Google is a search engine and makes no judgements other than those through its algorithms, which are based on terms, links, statistics, etc. Google does not actively GIVE [my emphasis] "a site a coveted #1 spot in their results because their algorithm is being played" - the site ends up algorithmically at the top for that reason. Moral, social, ethical judgements are the domain of humans. Karen Blakeman has shown in her blog - http://www.rba.co.uk/wordpress/2016/11/15/how-to-write-totally-misleading-headlines-for-social-media/ - how websites can influence what is shown to describe their pages by the cunning use of invisible false metadata - a 'Google bomb', if you will. That metadata was added by a human, and I suppose some ethical decision was made or ignored.
If Google followed the suggestion above and added a note at the foot of the page about more results being available, they would have to explain themselves in each case - it would be another moral, ethical - and possibly legal - minefield that we would not like.
It would be another story, of course, if a search engine actively promoted such sites.
Posted by: Chris Armstrong | December 15, 2016 at 12:53 PM
Sorry Chris, but your idea that a dictionary is the same as results from a search engine is just wrong. In a dictionary all words have equal merit; that's obviously not the case with search engine results, which are ranked according to an algorithm based on the idea that one result is better than the next. Yes, Google DOES give a site a #1 spot because of the way those algorithms work, and Google can, and does, change them on a regular basis.
Moral, social and ethical judgements are made by human beings, and it may surprise you, but Google is, ultimately, run by human beings. They change and tweak the results as they wish - as I clearly explained in the body of the blog post they did this when they got rid of the Google bombs. It's their *choice*, it's not some sort of accident.
You clearly don't have an alternative, so presumably you're quite happy with the current situation. Excuse me if I point out that not everyone is as casual about it as you are.
Posted by: Phil Bradley | December 16, 2016 at 12:39 PM