Google Glass hands-on review

We finally treat our eyes to some time with Google’s augmented reality specs
Wed Jul 24 2013, 17:42

IT'S BEEN MORE THAN A YEAR since Google first unwrapped its augmented reality eyewear concept, Glass. Since then, there's been a tonne of hype surrounding the device, and much of it hasn't been positive.

Nevertheless, we were still eager to get our mitts on the highly anticipated spectacles when Google invited us in for a demo today with one of only two devices currently circulating in the UK.

Developed over the past three years, the project - which delivers a smartphone-like experience projected into your field of vision - is still regarded by Google as a "concept" rather than an end user device, and the company is still improving it based on developer feedback; a point worth keeping in mind as you read our first hands-on impressions of the eyewear.

The Google Glass display box sits on the top right-hand side of the eyewear

First impressions
As we've followed the journey of Glass since its announcement in April last year - from the device being banned in a Seattle bar to the hacking vulnerabilities spotted by security experts just last week - we were more than keen to finally try it out. The only real impression we had of how Glass worked came from the early concept video Google released when the project was first announced.

But as we slotted Glass on to our face today, it didn't take long to realise that the video is not how the eyewear works in reality. Putting it on for the first time, we were surprised to find that Glass' display covers only a small part of your field of vision and isn't really that comfortable. The display is much smaller than we expected, sitting in the far right-hand corner of your vision.

This was the most annoying aspect of Glass during our hands-on: to read the information being shown, you have to look up towards the display, which is integrated into the right-hand arm of the specs. You cannot look straight ahead and see clearly what Glass is presenting to you, as you would naturally want to and as the concept video portrayed. Having to keep glancing at the far right-hand corner of your eyesight feels unnatural, forces your left eye to look up as well, and leaves you looking white-eyed and zombie-like to anyone in your company.

This is what Google Glass looks like when you slip it on

Operation and performance
Glass is turned on by tilting your head up or by tapping the side of its frame, where a touch-sensitive panel on the right arm detects and registers the direction of your strokes. The same panel can be used to swipe through the menus displayed in your vision, meaning you don't have to speak to the spectacles every time you want to operate them.
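For the developers among you, the sketch below gives a rough idea of how an app can tell a forward swipe from a backward one on an Android device. It uses the stock android.view.GestureDetector class rather than whatever touchpad interface Glass uses internally, and the swipe handlers are placeholders of our own invention, so treat it as an illustrative assumption rather than Google's implementation.

import android.app.Activity;
import android.os.Bundle;
import android.view.GestureDetector;
import android.view.MotionEvent;

// Illustration only: classifies horizontal swipes with the standard Android
// GestureDetector, standing in for whatever touchpad API Glass itself uses.
public class SwipeDemoActivity extends Activity {

    private GestureDetector gestureDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        gestureDetector = new GestureDetector(this,
                new GestureDetector.SimpleOnGestureListener() {
                    @Override
                    public boolean onFling(MotionEvent e1, MotionEvent e2,
                                           float velocityX, float velocityY) {
                        // Positive X velocity means the finger moved right,
                        // negative means it moved left.
                        if (velocityX > 0) {
                            onSwipeForward();   // e.g. move to the next menu card
                        } else {
                            onSwipeBackward();  // e.g. move to the previous card
                        }
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Feed raw touch events into the detector so onFling fires.
        return gestureDetector.onTouchEvent(event) || super.onTouchEvent(event);
    }

    private void onSwipeForward()  { /* advance through the menu */ }
    private void onSwipeBackward() { /* step back through the menu */ }
}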

Google Glass runs a heavily customised version of Google's Android operating system. It is meant to offer an experience very similar to using Android on a smartphone, and it works alongside a companion app that handles various settings, such as more complex options for how calendar reminders are presented to you through the headset.

Google Glass is linked up to an Android app

Once powered up, the first screen Glass presents you with looks rather like the Google search engine home page, with an "OK Glass" prompt under the logo. Saying this activates Glass and takes you into the main menu, where you can choose between "Google ...", "take a picture", "record a video", "get directions to ...", "send a message to ...", "make a call to ..." and "hang out with ...". The same menu can also be reached by tapping the side of Glass again.
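Out of interest, here is a rough sketch of how a recognised phrase could be mapped onto a menu like that one. It leans on Android's standard RecognizerIntent speech API rather than anything Glass-specific, and the dispatch logic is our own guess at the idea, not Google's code.

import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;

import java.util.ArrayList;
import java.util.Locale;

// Illustration only: maps a spoken phrase to a Glass-style menu action using
// Android's stock speech recogniser, not Google's own implementation.
public class VoiceMenuActivity extends Activity {

    private static final int REQUEST_SPEECH = 1;

    private void listenForCommand() {
        // Launch the platform speech recogniser and wait for a result.
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "OK Glass");
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK && data != null) {
            ArrayList<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (results != null && !results.isEmpty()) {
                dispatch(results.get(0).toLowerCase(Locale.UK));
            }
        }
    }

    private void dispatch(String phrase) {
        // Crude prefix matching against the menu options described above.
        if (phrase.startsWith("take a picture")) {
            // launch the camera
        } else if (phrase.startsWith("record a video")) {
            // start video capture
        } else if (phrase.startsWith("get directions to")) {
            // hand the rest of the phrase to navigation
        } else if (phrase.startsWith("send a message to")) {
            // compose a message
        } else if (phrase.startsWith("google")) {
            // run a web search on the rest of the phrase
        }
    }
}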

In our testing we asked for directions by voice, requesting a route to Oxford Circus station just around the corner. Glass worked almost flawlessly here, recognising our regional accent and registering every word we gabbled at it, even when we spoke rather quickly. It found the station almost instantly and displayed a small map of our current position, much as the Google Maps app does on a smartphone.

What really impressed us here, though, was the accuracy of the current location cursor, which shifted across the display as we turned our head while wearing the eyewear. This feature, we feel, is where Glass shows some real potential, as navigating from one place to another with a smartphone in hand is never an easy task.
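To give a feel for what is going on behind that cursor, the sketch below turns a compass heading from Android's rotation vector sensor into a horizontal offset on the display. The display width, visible arc and helper method are assumptions of ours for the sake of the example; Google's actual navigation code will no doubt differ.

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Illustration only: converts the device's compass heading into a horizontal
// cursor position, roughly the behaviour the navigation display exhibited.
public class HeadingCursor implements SensorEventListener {

    private static final int DISPLAY_WIDTH_PX = 640;   // assumed display width
    private static final float VISIBLE_ARC_DEG = 90f;  // assumed arc shown on the map

    private final SensorManager sensorManager;
    private final float bearingToTargetDeg;  // bearing from wearer to destination

    public HeadingCursor(Context context, float bearingToTargetDeg) {
        this.sensorManager =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        this.bearingToTargetDeg = bearingToTargetDeg;
    }

    public void start() {
        Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_UI);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float[] rotationMatrix = new float[9];
        float[] orientation = new float[3];
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);

        // orientation[0] is the azimuth (heading) in radians, clockwise from north.
        float headingDeg = (float) Math.toDegrees(orientation[0]);

        // How far the destination sits to the left or right of where you are looking,
        // wrapped into the range -180..180 degrees.
        float relative = ((bearingToTargetDeg - headingDeg) + 540f) % 360f - 180f;

        // Map that angle onto a pixel position; turning your head shifts the cursor.
        float cursorX = DISPLAY_WIDTH_PX / 2f
                + (relative / (VISIBLE_ARC_DEG / 2f)) * (DISPLAY_WIDTH_PX / 2f);

        drawCursorAt(cursorX);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void drawCursorAt(float x) { /* update the on-screen marker */ }
}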

 
