The right not to get caught
For @monadic, who forgot this was happening, and @stephenbonner, who asked for a blog post summarising events – a short write-up of last Friday’s Open Rights Group event, ‘Resisting the All Seeing Eye’, featuring Cory Doctorow and Charles Stross.
Things got off to a fairly predictable start for anybody who follows Cory’s and Charlie’s blogs and other postings. I’ve not got around to reading Little Brother yet, but that seems to be where Cory’s head still is. Charlie has been immersing himself in the next part of his Merchant Princes series, so it was no surprise to hear him making references to historical architecture and social norms. This was perhaps one of the key takeaway points of the evening – the contemporary concept of privacy hasn’t been around that long, being mostly a middle-class 20th century contrivance, so should we really be all that shocked/bothered that the concept is being changed as new social norms emerge?
As the introductions wound up Cory touched upon the futility (and obnoxiousness) of web filtering, particularly in schools. This resonates for me in the enterprise context too; on the one hand filtering seems to have gotten more and more in the way of doing business, but on the other hand it is becoming increasingly futile as people bring WWAN-connected netbooks and phones to work. There were two choice examples of kids hacking around filtering: 1) they would search for ‘proxy’ and then jump to page 75 of the results – no network administrator would bother to go to page 75; 2) kids would pick a random blog with an open comments system and use that for chat. Cory said that we should do more to teach kids about this stuff, then Charlie said we should do more to teach politicians about this stuff, and then the audience participation started…
One of the more interesting questions was whether we would see ‘the end of hypocrisy’ – that openness in all areas of life, including politics, would prevent politicians (and government) from behaving badly. Cory gave some examples of why this doesn’t appear to be working, citing 1) the home secretary choosing to upgrade cannabis’s classification despite her own confession to using it and advice from the government’s chief scientist, and 2) his own recent writing on how transparency means nothing without justice. For me this raised interesting questions about how justice can be returned to society, and whether society’s expectations of politicians are reasonable. A follow-on point was made that it would become increasingly hard to enter politics with a clean sheet, and that this might encourage political dynasties where kids are brought up to enter office, isolated from real life and experience. Along with politics becoming a career (entered in some way directly from higher education) rather than a calling (entered later on in life having gained experience elsewhere) – this could clearly be a problem. What does it take for the public to accept that politicians have some dirty laundry too?
Clearly there was an expectation in the room that issues around privacy should be somewhere on the political agenda. The question I never got around to asking was ‘what will it take for these issues to find their way onto a mainstream manifesto?’. Robin Wilton recently observed some accidental alignment between opposition party proposals and the pro-privacy lobby; but here I’m tempted to invoke the corollary to ‘never attribute to malice that which can be explained by incompetence’, which is ‘never ascribe to competence that which can be explained by good luck’. Getting back to the points on education, the underlying problem here is that nobody understands this stuff except the geeks, and so nobody cares except the geeks – that’s just not enough votes.
Both Cory and Charlie were quite down on the tabloid press, ‘the red tops’, and their role in steering public opinion, particularly the link between profit and fear-mongering. This was linked back to politicians being trapped in a vice over things like terrorist threat warnings – nothing can ever be relaxed or reversed, as then blame can be ascribed. To avoid blame, do nothing. On the way home I read Bob Blakley’s piece where he comes up with ‘Blakley’s Law’ to describe what’s going on here (with specific reference to the porkalypse) – “Every public alert system’s status indicator rises until it reaches its disaster imminent setting and remains at that setting until it is retired from service.” I know from my time in the military that it’s impossible to maintain states of high alert for more than a few days, and even raised awareness becomes tired in a matter of weeks. It is just ridiculous that we still routinely hear announcements along the lines of ‘in these times of heightened alert (following 9/11, which happened over 7 years ago) blah blah blah’. Of course those announcements are just another part of the state-sponsored apparatus of fear/control, in which the press are complicit in playing along rather than calling them out. At least we have Craig Murray.
Aggregation of personal data was a recurrent theme. Clearly the problem isn’t that people surrender nuggets of personal information for specific time-bound purposes, but that those nuggets end up sitting around in perpetuity on (public) databases where they can be aggregated and mined (for purposes that may be against the individual’s wishes or best interests). Charlie suggested that it would be nice to have a system where data is accessible when it’s less than 2 years old (whilst still current) and more than 100 years old (when the individual is likely dead, but the historians might care). One audience member pointed out that this sounded like an elaborate DRM scheme, which got Cory going on about the real evils of DRM being in hardware that is outside the control of the user/owner – well dodged, that man – you should be in politics ;-) When asked about practical measures both Cory and Charlie talked a lot about browsers, a little about CCTV, and a bit about RFID.
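Charlie’s suggested access windows could be sketched as a simple predicate – a hypothetical illustration only (the names and thresholds are mine, not anything proposed at the event):

```python
from datetime import date

# Hypothetical thresholds for Charlie's suggested scheme: data is readable
# while still current (< 2 years old) or once it is purely of historical
# interest (> 100 years old); in between, it is locked away.
RECENT_YEARS = 2
HISTORICAL_YEARS = 100


def is_accessible(record_date: date, today: date) -> bool:
    """Return True if a record falls inside either suggested access window."""
    age_in_years = (today - record_date).days / 365.25
    return age_in_years < RECENT_YEARS or age_in_years > HISTORICAL_YEARS


# A record from last year is still current; one from 1900 is historical;
# one from a decade ago sits in the locked middle period.
```

Of course, as the DRM objection in the room implied, the hard part isn’t expressing the rule – it’s enforcing it on copies of the data you no longer control.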
The next really interesting area for debate was privacy around DNA, and Charlie picked up the baton, pointing out that full genome sequencing is getting cheap very quickly. He then went on to explore the ‘shotgun sequencing’ that’s being done on ocean samples, and could also be applied to crime scenes. The trip into the future ended with a hypothetical toothbrush that could identify infections before they became symptomatic. Clearly this could have tremendous benefit for public (and individual) health; and also disastrous consequences for health insurance premiums etc. It’s here that we get to the heart of the issue – privacy in this context is about not wanting to surrender, or have used against me, information that might make me worse off – like that I committed a crime (e.g. speeding) or have genes that might make me a higher risk. Yet at the same time I don’t want speeding motorists to run down my kids, or to pay for high-risk customers using the same life insurance company as me. I want technology to make my life safer and less costly; I don’t want technology to make my life subject to state interference and more expensive – two sides of the same coin.
This brings us to some of the closing points… there is a social contract that exists between us as individuals, between us and commercial entities, and between us and the state. Technology is causing changes to that social contract – changes that we hope will benefit more than they harm. As we look to younger generations, they are being indoctrinated into accepting some of those changes, but often without open and fair debate about what the alternatives might be. Policy shapes the social contract, and the social contract shapes policy; but participants in both policy formulation and engagement in the broader social contract need to be informed of the issues.
Endnote – On the way home I fired off an email to Charlie pointing him to Jerry Fishenden’s work in this area. The weekend also brought us a great post from Ben Laurie on the relationship between privacy and social networks.
Tags: Charles Stross, Cory Doctorow, hacking, open rights group, politics, privacy, transparency