
Developing Glassware using the Mirror API

Google Glass Explorer Hardware

To understand Google Glass development, let’s start by taking a look at the hardware.

The Google Glass Explorer Edition is the first Glass device available to the general public. For many people, it will be the first wearable computing device they experience. Despite its futuristic appearance, the device itself has some pretty modest underpinnings.

While it does not look like an Android phone, Glass actually shares a SoC processor with the Samsung-built Nexus S from 2010. Yes, the device of the future is powered by the same underpinnings as a three-year-old Android phone. The similarities to that three-year-old device do not end with the processor. Glass even shares the same 512MB of RAM and 16GB of storage. The Explorer Edition runs a modified, stripped-down version of Android 4.0.4 that runs well on that small memory footprint.

Of course, Glass is not packaged like a Nexus S. A tiny high-density LCD projector, manufactured by Himax, gives the Explorer Edition a sharp 640x360 display projected via a mirrored prism. While you may not have heard of Himax, Google has already acquired a stake in the company.

Naturally, there’s no touchscreen, but Glass does feature a multi-touch sensor along the side of its frame. Aside from a camera button on the top of the device, it’s the only means of physical user input on the Explorer Edition.

Beyond the touch sensor, Glass incorporates a microphone as a critical part of the Glass user experience. Speech recognition and touch input are the only direct means of interacting with Glass; the camera and other sensors can potentially fill the role of indirect input. While these mechanisms are powerful, developers may find them limiting as they build their applications.

The Explorer Edition incorporates a 5 MP camera sensor that is capable of 720p video. Unlike most smartphone cameras, it uses a wide-angle lens that lets Glass capture expansive pictures. Given the prominence of the camera in the marketing of Glass, it’s fortunate that Google equipped the device with a very capable sensor.

Instead of requiring earbuds or using a traditional speaker, Glass relies upon a bone conduction transducer to deliver audio to the user. The technology is not new. Bone conduction has been used by the military and the hearing impaired for decades; its integration into the Explorer Edition finally brings it to the masses. It’s worth noting that this is a functional compromise: the output is not particularly loud, and the frequency response will not impress audiophiles.

The sensors included in the Google Glass Explorer Edition are largely familiar to Android developers. An accelerometer is present along with a gyroscope and a compass. If you dig a little deeper, you will find a GPS receiver that is disabled on unrooted devices. There is one sensor on the Explorer Edition that you will not find on an Android smartphone -- a pupil tracker. While this array of sensors exists on the device, Google has yet to make them officially available to developers as of this writing.
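
Because Glass runs Android, the standard SensorManager framework API is the most likely route to these sensors if and when Google exposes them. The sketch below is a hypothetical example rather than documented Glass behavior: it is ordinary Android (API level 15) code that simply enumerates whatever sensors the device chooses to report, and it applies only to an APK sideloaded onto the device over ADB, not to Mirror API Glassware.

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import android.util.Log;

    import java.util.List;

    // Hypothetical activity name; logs every sensor the device reports.
    public class SensorInventoryActivity extends Activity {
        private static final String TAG = "GlassSensors";

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            // Ask the framework for the full list of available sensors.
            SensorManager sensorManager =
                    (SensorManager) getSystemService(SENSOR_SERVICE);
            List<Sensor> sensors = sensorManager.getSensorList(Sensor.TYPE_ALL);

            // Read the results over "adb logcat"; what shows up here is
            // decided by the device, not by this code.
            for (Sensor sensor : sensors) {
                Log.i(TAG, sensor.getName() + " (" + sensor.getVendor()
                        + "), type " + sensor.getType());
            }
        }
    }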

While the Explorer Edition hardware includes Bluetooth 4.0 with Bluetooth Low Energy (BLE), the installed version of Android (4.0.4, API level 15) does not support BLE. Bluetooth is used to tether the device to a phone or another Internet access point. Glass also supports 802.11b/g WiFi.

For all of this functionality, it’s important to note that Glass is powered by a very modest 500 mAh battery. That is a mere third of the size of the 1,500 mAh battery that powered the Nexus S. As a developer, you will find that the limitations of that battery weigh heavily on your application design.

The only physical port on Glass is a standard micro USB port. Like a smartphone, Glass can be charged through this port, and pictures can be synchronized over the connection. With debug mode enabled, the Android Debug Bridge (ADB) makes the device appear to developers like a traditional Android device.

When it comes to making the most of this hardware, the Android underpinnings are practically misdirection. While the hardware is capable, Glass requires a development approach that is distinct from that of a traditional Android device.
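
As a preview of that approach, here is a minimal sketch of what Mirror API Glassware can look like in Java. It assumes the Mirror API Java client library and an OAuth 2.0 credential for the Glass user obtained elsewhere; the class name and application name are placeholders. The point is the model: your code runs on a server and pushes a card to the user's timeline through Google's REST service, rather than running on the device itself.

    import com.google.api.client.auth.oauth2.Credential;
    import com.google.api.client.http.HttpTransport;
    import com.google.api.client.http.javanet.NetHttpTransport;
    import com.google.api.client.json.JsonFactory;
    import com.google.api.client.json.jackson2.JacksonFactory;
    import com.google.api.services.mirror.Mirror;
    import com.google.api.services.mirror.model.TimelineItem;

    import java.io.IOException;

    // Hypothetical server-side helper; "credential" comes from the user's
    // OAuth 2.0 authorization flow, which is not shown here.
    public class HelloGlassware {

        public static void sayHello(Credential credential) throws IOException {
            HttpTransport transport = new NetHttpTransport();
            JsonFactory jsonFactory = new JacksonFactory();

            // Build a Mirror API client that acts on the user's behalf.
            Mirror mirror = new Mirror.Builder(transport, jsonFactory, credential)
                    .setApplicationName("Hello Glassware")  // placeholder name
                    .build();

            // Insert a simple text card into the user's timeline.
            TimelineItem card = new TimelineItem();
            card.setText("Hello from the Mirror API");
            mirror.timeline().insert(card).execute();
        }
    }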
