Google Glass And Thinking Differently About Technology

Google’s Project Glass is a very interesting product.

At this week’s TED 2013 conference in Long Beach, I had the chance to wear and tinker with Google’s Project Glass (for photographic proof, you can follow me on Twitter and see the pictures). My expectations were not high, but the product blew me away (big time). Much of the online discussion about what it means to wear this pair of Internet-enabled glasses had me thinking that the technology was not ready for prime time. That online discussion is wrong. The discourse has covered everything from the weight and feel of the glasses (they feel great and don’t weigh much at all… in fact, they were surprisingly light), to privacy (how do we know when people wearing the glasses are recording everything?), to more privacy issues (can Google see, capture and use everything our eyes now see?), to how we interact with one another (will we be paying attention to the people in front of us or to the screen?).

What the experience of Google Glass is really like.

It is hard to tell from looking at the glasses or watching the video demos, but the screen does not obstruct your vision at all. In fact, if you’re looking at people or anywhere in your general field of vision, you don’t even see the screen. I know that this will surprise many people, but it’s true. To see the screen, you actually have to look up and to the right. Imagine moving your eyes as high and as far to the right as you can without moving your head. That’s where the screen resides. It’s also shaped and colored in a way that does not distract from your general view in the least. I have been somewhat surprised by a lot of the online commentary, because Google has not created a device with a screen that intrudes on your regular, everyday experience. In fact, you don’t even notice the screen unless you actively force your eyes to see it. In short, it was an amazing and different experience. It was as fascinating as the first time I tried a mobile phone or put my hands on the screen of an iPhone.

What you quickly realize about Google Glass.

Google Glass is an intermediary technology. Nothing more. Nothing less. It is the first step in freeing us human beings from having to hold a technology that must be manipulated by our fingers, as we move to wearable technology that is manipulated with our eyes and our voice. That is the first big step. Beyond this move away from carrying a device in our pockets, it is the intermediary until we reach the next big step, when any surface can project the information it shares, instead of our having to wear the actual screen over our eyes. Once you wear the glasses and feel what it’s like to receive audio via bone conduction, speak commands and move through your choices seamlessly, it becomes immediately clear that the future is not in wearing glasses just because something has to house the screen, but in the ability to simply see the content anywhere and everywhere. A true heads-up display for life.

The distraction of screens.

Some worry that having a screen situated above your right eye will cause us to lose all human connection: that while someone is standing in front of us, we could appear to be making eye contact while actually being distracted by whatever notifications the glasses are pumping through. The people who say this probably haven’t tried them on, or worked with them long enough to realize that interacting with the screen is just as rude as fumbling in your pocket for a phone or looking around the room while the person in front of you is speaking. Technology won’t give humans better manners. Human beings are going to have to get better at managing their technology, rather than allowing the technology to manage them. This isn’t an issue specific to Google Glass. People have allowed their devices to ping, ring and interrupt since those features first became available. Google Glass doesn’t aggravate that situation any more than what we’re currently dealing with.

Everything you say…

In my first book, Six Pixels of Separation, I wrote that in a social media world increasingly co-opted by smartphones and mobile devices, everything we say and do can now be used against us in the court of public opinion. That still rings true. Governments are moving beyond fixed cameras on streets to drones for surveillance and more. Citizens have moved from simple camera functionality to full-bore HD video that can be recorded and streamed live from an iPhone. I’m not sure why people are (somewhat) up in arms about Google Glass and what it’s recording. The easiest way to deal with privacy concerns is this: if someone you’re talking to is wearing these glasses and you do not want the conversation documented, simply ask them to remove the glasses. Then again, what guarantees do we have that they’re not still recording it on their smartphone, or that the person sitting nearby wearing the glasses isn’t recording your conversation as well? Who knows. I think Jeff Jarvis, in his book Public Parts, is on to something: as we live in a world where anyone and everyone can report our lives in text, images, audio and video, we have to redefine what we consider "private."

Is it the next big thing?

There is no doubt that wearable technology and products like Google Glass are a peek into what the world will look like in the not-so-distant future. Still, it feels like an intermediary technology that will quickly evolve toward a more interesting place: one where the screen can be anywhere and everywhere. So, if you think people wearing Google Glass will be ignoring you as they stare at their screens, I would caution you to look at the current state of our world, where a circle of friends often includes some who are on their smartphones and not actively participating in the moment. While Apple races to deliver its first iteration of wearable technology, Google has also proved something powerful with Google Glass: it has quickly (and once again) become one of the most innovative and fascinating companies in the world.


  1. Mitch,
    I saw your post on Facebook earlier today and was hoping you’d write a blog post about your experience, certainly glad you did.
    Wearable technology is clearly the next up-and-coming step. With FitBit and other similar wearable tech, it seems like we’re gearing up for this to be the next wave — like you said, an intermediary step toward interfaces on every surface.
    But I have to wonder if this is actually a step toward voice interface more than anything else, have we already moved beyond touch? It seemed like Apple was going in that direction with Siri.
    How was the interaction with it?
    Was it more voice commands or more eye movement commands? Or an equal split? Thus replacing touch with see.
    If it’s really just a new way to interface with technology (is it more than that?) then the utility comes down to the hands free aspect and the, for lack of a better word “Apps” or is apps too narrow?
    So does this mean you have to change your mantra now? Mobile, Touch and now See?
    By the way did you mean to include the text “Insert Google video here.” above the video in this post?

  2. Mitch,
    Thanks for recounting the personal experience. Like so many of these digital tools and gadgets, you need to live them to understand them… The iPhone and iPad, for example, were like that before people touched and “played” with them. The thought that came to me as I watched that video is that you can truly feel like you are in someone else’s shoes. Brilliant. I’d love to watch through Federer’s or Crosby’s eyes, to name but two.
    Can’t wait to get my eyes into a pair. As @Bobby says, it will be a whole new world of app development to accompany such a technology.

Comments are closed.