I wonder, however, if this isn’t just a case of misdirected body horror. We don’t like cyborg modification, generally. We might wax enthusiastic about the benefits of cochlear and retinal implants, pacemakers and insulin pumps, and praise the recent breakthroughs in prosthetic limbs. But we are still uncomfortable with them. Apotemnophilia is considered a disease. “Cyber addiction” is considered a disease. Lepht Anonym is denied health insurance, and castigated by medical professionals. We might allow a modification as a fix for a debilitating condition, but only because we consider the modification itself to be debilitating. One is still “disabled”, if one must be constantly plugged into a machine.
But this is not simply me stumping for the rights of grinders and bio-hackers. My concern is that while there are real issues involved with placing cameras everywhere, our discomfort with Google Glass is driven by body horror, not fear of surveillance institutions. It is difficult to fault necessary skepticism, but if it is not driven by the right motivations, it is closer to fear. In the same way that the outrage against drones is, in some ways, driven by a fear of “evil flying robots” more than a political reaction to technological imperialism, it is more important than ever to think about how cameras and data actually work, whether they are strapped to an aircraft or our faces, our architecture or our appliances.
I’m very optimistic about the future of wearable computing, even if I’m not as excited about this first iteration of Glass, but I also think it’s important that we stop and ask ourselves about the implications of tech on society. Even if we move forward anyway (and we likely will), we should do so with our eyes wide open - so to speak.