Spoiler alert: many of us are addicted to connectivity and to hyper-awareness of the data streams surrounding our physical world. Manners and cultural values currently create cognitive dissonance: we suppress the urge to glance at a smartphone while having a face-to-face conversation, and the resulting distraction keeps us from being fully present with the person we are speaking with.
Google takes a step toward easing this internal struggle with the introduction of Project Glass – a heads-up smartphone display that projects information onto a lens while you pretend to be fully engaged with whatever you are looking at.
The issue with Project Glass in its current iteration is that navigating the interface requires voice commands. What will it take to get from Google’s vision all the way to the one Cory Doctorow presented in Down and Out in the Magic Kingdom?
The missing link is a technology that already exists. Emotiv has been working on a product that detects EEG brainwaves and translates them into computer commands, letting users manipulate computers just by thinking.
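The core idea behind that kind of brain-computer interface can be sketched in a few lines: sample a window of EEG signal, reduce it to a feature (here, mean signal power), and threshold that feature into a discrete command. This is a toy illustration only – the function name, the threshold, and the sample values are all invented for demonstration and have nothing to do with Emotiv's actual SDK or signal processing.

```python
# Hypothetical sketch of a brainwave-to-command mapping (NOT Emotiv's API).
# A real system would use trained classifiers over multi-channel spectral
# features; here we fake it with mean squared amplitude of one window.

def classify_window(samples, threshold=0.5):
    """Map a window of EEG samples to a UI command by power thresholding."""
    power = sum(s * s for s in samples) / len(samples)
    return "select" if power > threshold else "idle"

# A low-amplitude "relaxed" window vs. a high-amplitude "focused" window
relaxed = [0.1, -0.2, 0.15, -0.1]
focused = [0.9, -1.1, 1.0, -0.8]

print(classify_window(relaxed))  # idle
print(classify_window(focused))  # select
```

The hard part in practice is exactly what this toy glosses over: choosing features and thresholds so that only deliberate mental effort, not background brain activity, crosses the line into a command.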
We are one device convergence away from a completely computer-mediated existence! Yay?
The ultimate iteration of this trend is the synthetic neocortex extender. Google glasses rock, but it won’t be easy to link commands to deliberate thoughts while filtering out less conspicuous, less conscious mental activity, so accidental commands are a real risk. Too bad inconspicuous hand commands aren’t used (although I’ve heard of a laser-projected virtual keyboard combined with Google glasses).