Monday, July 22, 2013

No Budget to Block Porn? Confuse the Public and Rope In ISPs...

For the past month or so, the UK government has increased its hot-air output on the subject of online pornography. I hope their aims are admirable (and I have to assume they are), but there seems to be relatively little method and much more madness right now. Where are they going wrong, and what can be done about it?

Not all porn is Child Abuse. Following two recent, high-profile cases in which child murderers were found to have viewed child abuse images, there were a number of hasty pronouncements, fuelled in large part by "enthusiastic" press coverage. Most of these centred on "regular", legal pornography.

This is a problem. Even if most viewers of abuse imagery do also view legal porn, it doesn't follow that viewing legal porn leads to viewing child abuse imagery. Users of illegal drugs also buy headache tablets in the supermarket - should we ban all painkillers because buyers might turn to illegal drugs? I fear, however, that good sense makes poor headlines, so we're probably stuck with this crooked thinking.

It is difficult to decide what is "porn": to protect the children, the suggestion is that ISPs block access to porn "by default" (though there seems to be some weaselling on the cards around the word "default"). However this happens, the question will arise: who decides what is pornography? It won't be the government - they've devolved responsibility to a private organisation (your ISP), which will further devolve it to a filtering company.

I know a little about the inner workings of one such filtering company - we at Smoothwall spend quite some effort making sure things are as well categorised as they can be. It's a difficult question - one US judge managed to come up with an interesting answer: "I know it when I see it." Our lists aren't perfect, but the "lowest bidder" is likely to be some faceless off-shore corporate who frankly won't give a <censored> if your favourite sports forum has been misidentified as pornographic.
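To make the stakes concrete, here is a minimal sketch of how a domain-category blocklist is typically consulted. The domain names and categories are entirely hypothetical; the point is that a single mislabelled entry blocks an innocent site for every customer behind the filter, with nobody accountable for the mistake.

```python
# Hypothetical sketch: a filter consults a domain-to-category list,
# then blocks any domain whose category is on the banned set.
# One bad entry ("mislabelled" below) silently blocks a whole site.

CATEGORY_LIST = {
    "adultsite.test": "pornography",
    "sportsforum.test": "pornography",  # mislabelled - should be "sport"
}

def category_for(domain: str) -> str:
    """Look up a domain's category; unknown domains are uncategorised."""
    return CATEGORY_LIST.get(domain, "uncategorised")

def blocked(domain: str, banned=frozenset({"pornography"})) -> bool:
    """A domain is blocked if its category is in the banned set."""
    return category_for(domain) in banned

print(blocked("sportsforum.test"))  # True - collateral damage of one bad row
print(blocked("adultsite.test"))    # True - the intended block
```

The lookup itself is trivial; all the difficulty (and all the harm) lives in the quality of the list, which is exactly the part the lowest bidder will skimp on.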

Update: The BBC have picked up on this outsourcing of filtering and identified TalkTalk's filtering partner as Huawei, who have been stuck with the "they must be up to no good because they're from China" tag - a nasty generalisation, but one prevalent in the media right now. It's interesting to note that TalkTalk appeared to distance themselves from Huawei by overplaying their links with Symantec (having spoken with industry insiders on this, it's not news...). So we're already seeing a company viewed as "undesirable" making moral decisions on behalf of TalkTalk's customers. See also, wedge: thin end.

Many very popular sites host plenty of porn, and ISP-level blocking is going to be pretty brutal. I will have a good old nibble of my hat if we get anything better than domain blocking, but if there's full HTTPS inspection, I'll eat the thing whole, and the matching gloves, before moving to a country with a less invasive government (and preferably hot weather, as I will have ingested my hat & gloves).

Let's take an example of why we need granularity for blocking to be any good: Twitter. Whilst indulging in a spot of online ornithology, you might enter the search term "great tits". There you go: plenty of porn-over-HTTPS on a domain you can't block. Time to legislate seven shades out of Twitter, and the next site, and the next...
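The Twitter problem above can be shown in a few lines. Without breaking TLS open (full HTTPS inspection), a network filter sees only the hostname - via the DNS query or the TLS SNI field - never the path or query string. This sketch assumes a simple passive filter with a hypothetical blocklist; the point is that two wildly different pages on the same HTTPS domain are indistinguishable to it.

```python
# Sketch: what a passive ISP filter can actually see of an HTTPS URL.
# Only the hostname leaks (DNS / TLS SNI); the path and query do not.

from urllib.parse import urlparse

BLOCKED_DOMAINS = {"adultsite.test"}  # hypothetical domain blocklist

def visible_to_isp(url: str) -> str:
    """Return the only part of an HTTPS URL a passive filter can see."""
    return urlparse(url).hostname

def is_blocked(url: str) -> bool:
    return visible_to_isp(url) in BLOCKED_DOMAINS

# Two very different pages look identical at the network layer:
print(visible_to_isp("https://twitter.com/search?q=great%20tits"))  # twitter.com
print(visible_to_isp("https://twitter.com/BBCSpringwatch"))         # twitter.com

# So the filter's only options are "all of Twitter" or "none of it":
print(is_blocked("https://twitter.com/search?q=great%20tits"))  # False
```

Anything finer-grained than this requires the ISP to decrypt your traffic - which is exactly the hat-and-gloves scenario.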

Finally, let's touch on an old favourite hobby horse of mine: the Internet is not The Web - and there are plenty of non-web services out there, from the old school like NNTP newsgroups, to the more modern like encrypted peer-to-peer, and a bunch in between where some of the worst images are found. If we aim at Google, we're preaching to the choir: they already work with the relevant bodies to keep their results as clean as possible. Again, this is focusing on the wrong place if the real aim is to clean up child abuse imagery.

My suggestion? Make sure the bodies responsible for this sort of thing are adequately funded. I would like to see the creation and distribution of Child Abuse Images come to a complete stop. These latest proposals take aim at two targets, though, and when you try to aim at two things at once, one of those shots is likely to miss the target, let alone the bull's-eye.
