I’m not sure if Google is trying to create a consumer product or some kind of collective monitoring tool at this point. Buried deep in the Glass source code, and uncovered by the fine folks over at Android Police, is code that puts the device into a constant listening mode.
Ron Amadeo over at Android Police:
“OK_GLASS_EVERYWHERE” does exactly what it says on the tin. Enable this, and you’ll be able to say ‘Ok Glass’ on just about any screen. The default Glass setting is to only listen on the “Ok Glass” screen, which is crap. Enabling this makes Glass feel a lot more intelligent – it is always listening.
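From Amadeo’s description, the flag sounds like a simple gate on when the “Ok Glass” hotword recognizer is allowed to run. Here’s a minimal sketch of that idea in Java; the names and structure are my own guesses for illustration, not Google’s actual Glass code:

// Hypothetical illustration of how a flag like OK_GLASS_EVERYWHERE could gate
// hotword listening. All names here are made up for the sake of the example.
public class HotwordGate {

    // In a real build this would come from the device's internal settings store;
    // here it is just a constant so the sketch stands on its own.
    private static final boolean OK_GLASS_EVERYWHERE = true;

    // Decide whether the "Ok Glass" recognizer should run on the current screen.
    static boolean shouldListen(String currentScreen) {
        if (OK_GLASS_EVERYWHERE) {
            return true; // flag on: listen on every screen
        }
        // Default behavior: only listen on the dedicated "Ok Glass" screen.
        return "ok_glass_home".equals(currentScreen);
    }

    public static void main(String[] args) {
        System.out.println(shouldListen("timeline"));      // true with the flag enabled
        System.out.println(shouldListen("ok_glass_home")); // true either way
    }
}

In other words, flipping one setting turns a single-screen voice trigger into an always-on one, which is exactly why it’s both useful and a little unnerving.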
Am I the only one who’s watched 2001: A Space Odyssey? Seriously? Who wants a piece of technology that’s always listening to what we’re doing, and can also take a snapshot or video when needed? What’s next, complete autonomy?
All kidding aside, as Ron Amadeo points out, having Glass listen to what’s going on makes it a lot more useful. There’s probably a dialogue to be had about whether or not the benefits outweigh the loss of privacy. A lot of you would probably say they don’t, and that privacy is the most important thing here, but would you think any differently if you had come up with this technology on your own, or if it came from a company not named Google? I probably would.