Wednesday, June 25, 2008

Google's obscene machine


The judgement of good taste and social values is usually left up to the individual, who is considered to have their own set of ‘good taste’ preferences and can be trusted to know right from wrong.

In the States this might be about to change. Matt Richtel, reporting for the New York Times in 'What's obscene? Google could have the answer', outlines how, in the trial of a pornographic web operator, jurors have been given insight into the most Googled search terms of the residents of Pensacola. There they are more likely to Google ‘orgy’ than to make searches for tamer and less politically incorrect terms. I had to contain my amusement when ‘watermelon’ was given as the yardstick contrast to ‘orgy’, although perhaps the fruitier connotation is lost on our American counterparts.

Amusement aside, this case has serious implications for the rest of us in the Google world. Privacy is an obvious area for debate, especially in terms of the storing and observing of individual data. Only this week, councils in the UK have been reprimanded for using surveillance technologies against ‘minor’ deviant activities such as dog fouling and littering, accused of being 'intrusive, ineffective and expensive'. The prospect of having measurements and judgements made against Google searches can be viewed as invasive. However, it is worthwhile keeping in mind that the very appeal of Google as THE ubiquitous search engine is in itself significant, and perhaps means that the site should (rather unsurprisingly) carry the expected insidious inspection of all our actions.

For the record, this does not leave me feeling entirely comfortable with how my data is recorded. The loss of control over where, when and to whom such records are disclosed should be highlighted and is an important topic for public consideration. Of course, in the Pensacola case it is hard to argue anything against privacy considerations when Internet porn and inappropriate web surfing are involved.


All we can do for now is Google and watch this space…

2 comments:

David Barnard-Wills said...

There are a few really interesting things about this.

Firstly there's the aggregation of data - producing something that is described as the values a community holds, above and beyond the sum of the values any given individual in the community holds in their head. For me, it falls neatly into the category of 'terrible social science'.

Secondly, it's another case of generalisations being made about you because of a collective that you've been identified as being part of. You live in Pensacola? You like orgies. This is why this is a tricky problem to crack with ideas of privacy... because the data is supposedly anonymised, it can't invade privacy, right? But the collective decisions definitely can have some kind of social impact.

And finally, it's a nice rejoinder to the 'if you have nothing to hide, you have nothing to worry about' position.

You all have something to hide; you're all searching for porn!

Dr Mariann Hardey said...

I agree, David. I like the way that, alongside the social rejoinder of 'nothing to hide, nothing to worry about', this brings forth an essential social status that is derived from social responsibility.

OK for those 'in the know' and with a good and responsible level of reflexive awareness, but what about unintentional actions/victims, e.g. someone unaware of the sources they are navigating, other users on the same IP address, or different associations of what is right and wrong?

Over-generalisations aside, again the individual is put in the position of being responsible for their actions and knowledge about their own data. But what about the consequences of this? We do not have control over that, nor are there any guidelines about how to implement such an awareness. Maybe something worth Googling?...