If you've not read my first blog post on the Michelle Obama image fiasco, you might want to start there. I should also point out that this post refers to some words that people might find offensive - I've used asterisks and don't really think anyone will be upset, but that's your call.
Google's concept of images and SafeSearch is fundamentally flawed. Let's take a quick look at how this is supposed to work: "Use Google's SafeSearch filter if you don't want to see sites that contain pornography, explicit sexual content, profanity, and other types of hate content in your Google search results." Please note that we're talking about pornography AND hate content. One could assume from this that if you turn ON the SafeSearch filter you won't see hate content, yet clearly - as demonstrated in my previous post - that's exactly what you get. Ironically, if you turn OFF the SafeSearch filter you don't get to see the image. So things work exactly the opposite way round from what any sane person would expect.
Let's look at how this works with other terms. I ran two searches for sexual terms and two for racial slurs, because I wanted to see how the Google Images SafeSearch filter worked, and the results were very interesting. A search for f*ck gave 20 images with SafeSearch turned off, and 0 with it turned on. The same was true for c*nt. This is exactly what I'd expect to happen. However, when I ran searches for n*gger and w*g, the results were very different: 14 of the 20 and 12 of the 20 images respectively appeared in both the filtered and unfiltered results. Clearly these racist terms are not regarded in the same way as the sexual terms; they appear to be less 'important', if you will. Rather than a blanket return of 0 results, Google is quite happy to display images for racial slurs, but not for sexual content. Remember the 'hate content' element of SafeSearch? Please forgive my hollow laugh.
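(For anyone curious how I counted that overlap: I simply compared the top 20 image results returned with the filter off against the top 20 with it on. Here's a minimal sketch of the tally in Python - the URLs are placeholders standing in for real results, not anything Google returns.)

```python
# Count how many of the top-20 images returned with SafeSearch OFF
# also appear in the top-20 returned with SafeSearch ON.
# The URLs below are placeholders, not actual search results.

def overlap(unfiltered, filtered):
    """Number of images that appear in both result sets."""
    return len(set(unfiltered) & set(filtered))

# Hypothetical top-20 result lists for one query
safesearch_off = [f"http://example.com/img{i}.jpg" for i in range(20)]
safesearch_on = [f"http://example.com/img{i}.jpg" for i in range(6, 26)]

print(f"Overlap: {overlap(safesearch_off, safesearch_on)}/20")
# For the sexual terms the overlap was 0/20; for the racial slurs
# it was 14/20 and 12/20 - most of the images were still shown.
```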
Google's search algorithm is also broken. If I do a search for Michelle Obama, I want to see images of her, yet the first or second image that comes up isn't of her at all - it's an image that may once have been OF her, but isn't any longer. If I do an image search for 'bus' I'm not going to be impressed if I'm given a result for a rocket. I would expect Google to tweak the algorithm to fix that; I wouldn't expect to have to put up with it. Google should come clean, admit that the algorithm hasn't worked, and pull the image for no other reason than search accuracy.
However, Google isn't doing that. In the advert that they've taken out for this search they say "Google views the integrity of our search results as an extremely important priority." Clearly, however, they DON'T, because the integrity of their results here has been blown out of the water by this wholly inappropriate and inaccurate result.
They then go on to say "Accordingly, we do not remove a page from our search results simply because its content is unpopular or because we receive complaints concerning it." In other words, they are hiding behind their algorithms and saying that it's nothing to do with them. Well, I'm very much afraid that it is to do with them, because they wrote those algorithms in the first place! They chose how the algorithms were going to work, and they sat and tweaked them artificially until they got the overall result that they wanted. So the results are artificially affected by Google employees anyway! They also make the point that lots of people contact them asking for images to be removed, and that they only do so if a law has been broken. It would be easy to deal with this: once 'x' number of people complain about an image, simply move it automatically into another category. The 'x' could be set quite high to ensure that a small pressure group couldn't easily game it, and Google could always look at the image manually if necessary. You simply cannot tell me that Google hasn't got the resources available to do this.
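To be clear about how simple the mechanism I'm describing would be, here's a minimal sketch in Python. The threshold value, function names and in-memory store are my own invention for illustration - nothing here reflects how Google actually works.

```python
from collections import defaultdict

# 'x': set high enough that a small pressure group can't easily game it
COMPLAINT_THRESHOLD = 10_000

complaints = defaultdict(int)   # image URL -> number of complaints received
flagged = set()                 # images automatically moved into the filtered category

def register_complaint(image_url: str) -> None:
    """Record one complaint; reclassify the image once the threshold is reached."""
    complaints[image_url] += 1
    if complaints[image_url] >= COMPLAINT_THRESHOLD:
        flagged.add(image_url)   # a human reviewer can still inspect it later

def should_hide(image_url: str, safesearch_on: bool) -> bool:
    """Hide a flagged image from SafeSearch (filtered) results only."""
    return safesearch_on and image_url in flagged
```

Nothing in that sketch requires removing a page from the index; it simply changes which category an image sits in once enough people have objected.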
We're left with one stark and rather nasty conclusion. Google is happy to block access to unpleasant sexual imagery, but they're not prepared to do so for unpleasant racial imagery. That should make everyone a little uncomfortable.