With Facebook's announcement that it is gradually opting all of its users into HTTPS, yet another large chunk of the web gets a welcome layer of encryption.
Welcome, of course, as it helps protect users' highly personal data - often all too recoverable by network sniffing tools - and reduces the risk of cookie hijacking. It's by no means perfect, but it's a great addition.
On the other hand, this SSLization of the web poses a challenge for businesses and schools alike - with more traffic going over HTTPS, the pressure on web filtering to intercept and decrypt that traffic rises. In many instances, the stark choice is to either block a site completely or perform an intrusive "Man in the Middle" (MitM) inspection. These issues are always going to be most keenly felt on BYOD devices, where MitM decryption is more intrusive both technically and socially - hey, it's my device, my traffic, keep out!
There are no silver bullets here. Sure, we can identify most HTTPS traffic's ultimate destination (it's Facebook, it's Google), but many organisations need a finer level of policy if they are to allow these sites at all - forcing safe search is an important one for schools, while a business might want to restrict Facebook posts.
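To illustrate how a filter can learn a connection's destination without decrypting anything: modern browsers send the hostname in the clear in the SNI (Server Name Indication) extension of the TLS ClientHello. A minimal sketch of pulling that hostname out of a raw TLS record might look like the following (a best-effort illustration, not production parsing code - real traffic needs record reassembly and more defensive handling):

```python
import struct
from typing import Optional

def extract_sni(record: bytes) -> Optional[str]:
    """Best-effort extraction of the SNI hostname from a raw TLS record
    carrying a ClientHello. Returns None if absent or malformed."""
    try:
        if record[0] != 0x16:            # not a TLS Handshake record
            return None
        pos = 5                          # skip the 5-byte record header
        if record[pos] != 0x01:          # not a ClientHello
            return None
        pos += 4                         # handshake type + 3-byte length
        pos += 2 + 32                    # client version + random
        sid_len = record[pos]
        pos += 1 + sid_len               # session id
        cs_len = struct.unpack_from(">H", record, pos)[0]
        pos += 2 + cs_len                # cipher suites
        comp_len = record[pos]
        pos += 1 + comp_len              # compression methods
        ext_total = struct.unpack_from(">H", record, pos)[0]
        pos += 2
        end = pos + ext_total
        while pos + 4 <= end:            # walk the extension list
            ext_type, ext_len = struct.unpack_from(">HH", record, pos)
            pos += 4
            if ext_type == 0x0000:       # server_name extension
                # skip list length (2) + name type (1), read name length (2)
                name_len = struct.unpack_from(">H", record, pos + 3)[0]
                return record[pos + 5 : pos + 5 + name_len].decode("ascii")
            pos += ext_len
    except (IndexError, struct.error):
        pass
    return None
```

This is exactly the trick that lets a filter say "that's Facebook" and block or allow the whole site - but because the payload stays encrypted, it can go no deeper than the hostname, which is why per-feature policy still forces the MitM question.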
The creeping tide of HTTPS is not going away - the main thing keeping more large sites from going fully SSL is the cost/speed tradeoff (encryption at that scale can be computationally expensive) - and the need for web filtering across an ever more varied set of organisations shows no sign of waning either.
This is going to be a long and interesting ride... and I would welcome any comments from our readers on what they are doing to work around these problems, or what they think would be the ideal scenario.