Beacon Wars?

Continuing our iBeacon series, we're going to try to clear up any confusion around naming and standards - what are the differences (if any) between Bluetooth Beacons, iBeacons and AltBeacons? Will the original standard survive or will proprietary enhancements win out?

Bluetooth Low Energy (BLE) Beacons

Bluetooth Low Energy is part of the Bluetooth 4.0 spec, which defines a number of profiles describing what particular types of device should be able to do and how they should work. Bluetooth 4.0 has also been given the Bluetooth Smart branding.

The devices that have come to be known as beacons (or iBeacons) implement the proximity profile (finalised in 2011) that lets other devices detect when they are nearby.

Importantly, this means beacon behaviour is well defined and described by a standard, so that devices and software from different companies can work together. So why does Apple need to certify products before they can be iBeacon branded?


iBeacon

iBeacon is a brand and trademark applied to bluetooth low energy beacon technology by Apple, who certify products as iBeacon compatible, but do not manufacture iBeacons themselves (yet). Companies can apply to carry the branding on their products and will be given the required specifications only under a Non-Disclosure Agreement.

The only technical difference appears to be that iBeacons should transmit an Apple-defined prefix that is not required by the original BLE standard.
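
Public reverse-engineering work has documented that prefix and the fields that follow it: an Apple company identifier, an iBeacon type marker, then the beacon's UUID, major and minor numbers and a calibrated transmit power. A minimal parsing sketch, assuming the widely published (not officially documented) field layout:

```javascript
// Parse the manufacturer-specific portion of an iBeacon advertisement.
// The layout used here (Apple company ID 0x004C, type 0x02, length 0x15,
// then UUID/major/minor/TX power) is the widely published reverse-engineered
// format, not Apple's NDA'd specification.
function parseIBeacon(bytes) {
  // bytes: array of the manufacturer-specific data octets
  if (bytes.length < 25) return null;
  const isApple = bytes[0] === 0x4c && bytes[1] === 0x00; // company ID, little endian
  const isIBeacon = bytes[2] === 0x02 && bytes[3] === 0x15; // iBeacon type + length
  if (!isApple || !isIBeacon) return null;
  const hex = (b) => b.toString(16).padStart(2, '0');
  const uuid = bytes.slice(4, 20).map(hex).join(''); // 16-byte proximity UUID
  const major = (bytes[20] << 8) | bytes[21]; // big endian
  const minor = (bytes[22] << 8) | bytes[23];
  const txPower = (bytes[24] << 24) >> 24; // signed byte: calibrated RSSI at 1 m
  return { uuid, major, minor, txPower };
}
```

Receiving apps compare the measured signal strength against that calibrated txPower value to estimate how far away the beacon is.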

So, right now, there is no practical difference between iBeacons and non-Apple certified BLE beacons. However the fact that Apple has introduced an identifier means that they could potentially lock down iOS to only recognise iBeacon certified devices. Though this might seem unlikely, I believe it's inevitable they will try to differentiate the feature set of "iBeacons" from plain old BLE beacons in some way in the near future. It also means that you won't be seeing any Android software or hardware branded as iBeacon anytime soon (more on this in the next section).

Finally, it's worth noting that Apple have been granted a number of patents around iBeacon technology (with more likely to emerge), and have made filings relating to an Apple iBeacon device.

Whether Apple's iBeacon certification will remain compatible with the open standards, or whether Apple will see enough commercial opportunity in differentiation, remains to be seen.


iBeacon and Android

One side effect of Apple's certification requirements was that companies had to pull or rename Android software they may have been attaching the iBeacon label to. This led to Radius Networks dropping their iBeacon Software Development Kit for Android.

Radius Networks brand their own beacons as RadBeacon; these are, however, Apple-certified as iBeacon compliant.


AltBeacon

Perhaps as a result of the above, Radius Networks sensed a move towards proprietary customisation and away from Android, and introduced AltBeacon as an open alternative to iBeacon.

Because there is no open and interoperable specification for proximity beacons, Radius Networks has authored the AltBeacon specification as a proposal for how to solve this problem.

This specification is simple and compliant with the original Bluetooth 4.0 standard, using the fields originally defined as Manufacturer Specific Advertising Data to squeeze a bit more into the messages transmitted by AltBeacons.
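
The published AltBeacon specification lays that manufacturer data out as a manufacturer ID, a fixed beacon code, a 20-byte beacon ID, a reference RSSI and a single manufacturer-reserved byte. A sketch of reading those fields, following the open spec:

```javascript
// Read the fields of an AltBeacon advertisement's manufacturer data,
// per the published open AltBeacon specification.
function parseAltBeacon(bytes) {
  if (bytes.length < 26) return null;
  const mfgId = bytes[0] | (bytes[1] << 8); // Bluetooth SIG company ID, little endian
  const isAltBeacon = bytes[2] === 0xbe && bytes[3] === 0xac; // fixed beacon code 0xBEAC
  if (!isAltBeacon) return null;
  const hex = (b) => b.toString(16).padStart(2, '0');
  return {
    mfgId,
    beaconId: bytes.slice(4, 24).map(hex).join(''), // 20-byte beacon ID
    refRssi: (bytes[24] << 24) >> 24, // signed: expected RSSI at 1 m
    mfgReserved: bytes[25], // one byte for proprietary use
  };
}
```

Note how the single reserved byte, and the freedom to structure the 20-byte beacon ID as the manufacturer chooses, is where the proprietary extension scope described below comes from.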

Since the standard is open, manufacturers can build products that work with each other, but there is also scope for them to add proprietary data that could (for example) lock their own hardware to their own software, or be used to provide additional features.

It seems unlikely that AltBeacon will be widely adopted as it's attracting little attention, however it could gain traction if Apple attempts to significantly diverge from Bluetooth standards with iBeacon.

What About Everyone Else?

Beacon manufacturers generally seem to be obtaining iBeacon certification whilst continuing to tout the cross-platform nature of their products.

  • "Works with iBeacon, Android, and Bluetooth Smart"
  • Qualcomm/Gimbal - "APIs for both iOS and Android"
  • Estimote - "Full iOS and Android compatibility"

This is good news, with suppliers adding value via their software offerings, end-to-end solutions and product ecosystems rather than attempting to hijack the standard.

What Does This All Mean?

Right now, iBeacon is an exercise in branding, with hardware and software from major manufacturers able to work together using the original Bluetooth standard for proximity. Further, the biggest players seem to be promoting this fact rather than trying to diverge in order to stand out.

However, it's worth remembering that Apple devices dominate the potential audience for beacon-related products and services. When we wrote about this before, an estimated 90% of iPhone owners (vs 18% of Android owners) could use beacon apps.

Currently those figures are around 45% for Android and 96% for iOS. With Apple leading deployments, and better integrating the experience of beacon-enabled apps into its operating system, it may have the leverage to dictate future developments.

The Future of iBeacons

This is the concluding article in our series on iBeacons (AKA Bluetooth Low Energy beacons). It discusses imminent changes to the way iBeacons work and presents some ideas on how things might develop in future.

iBeacons Today

Our last article covered the range of iBeacon applications that exist today in various industries. In the last year or so, the pace of deployment has been rapid and, more recently, there have been some interesting developments in the industry and the technology itself.

In July, one beacon company received a $2M investment from Sunstone Capital, with the firm's venture capitalist Max Niederhofer saying

Bluetooth Low Energy and iBeacon are the building blocks of the next wave of computing...

And, with a recent report predicting there will be a 60 million unit market in 2019, widespread adoption looks increasingly likely. Hardware costs per unit are also decreasing.

Whilst not building directly on iBeacon technology, Apple's recently announced mobile payments system is already going hand in hand with iBeacon deployments. This makes perfect sense given Apple's support for the platform, and the fact that it extends retailers' connection with their customers from the point of sale to the whole shopping experience.

What is Changing

In a previous article we explained that (at least) 90% of iPhone users are able to use iBeacon, a significantly higher figure than for Android users (around 18%), and Apple have led the way in making subtle changes to remove potential barriers to iBeacon app usage.

This hints at more operating system level support, and one could imagine actions being triggered in a more automated way than at present. iOS 8 already prompts users about iBeacon apps and this could be extended such that beacons could act as operating system-level triggers.

The need for a dedicated app is currently a barrier to entry for smaller outfits - could the operating system and web browser combine to remove it? This is a little like web notifications, which are implemented by a number of desktop browsers, though notably no mobile browsers yet. For example, eBay can alert me via my installed desktop browser (but outside of any web page) about watched items that are ending. On mobile, something similar could remove the need for a dedicated app to receive such notifications, and we could imagine the same mechanism for iBeacon-related alerts and content.

It may also be significant that Apple has changed the way its devices scan for wifi networks, meaning they can no longer be identified and tracked in that way. This leaves beacon-enabled apps as the main way to physically track users at a location.

New Types of Beacon

Will beacons themselves change? Current indications are that they will certainly evolve.

Estimote recently announced stickers that function as iBeacons, with additional temperature and motion sensors enabling new applications. It's intended that they can be attached to moving objects that could then be tracked as they pass monitoring stations, or whilst they're in an app user's vicinity (like packages, or children, for example!). Other manufacturers are releasing cloud beacons with wifi connectivity, allowing these beacons to transmit analytics data and provide a gateway for managing other regular beacons in their vicinity. However, wifi requires power, so they must be plugged in to a power outlet or at least recharged regularly.

These developments maintain backwards compatibility, but it's likely the strain between adding functionality and maintaining standards compliance will increase as vendors try to differentiate themselves in a market where hardware costs per unit are rapidly heading towards negligible.

Users Become Moving Beacons

Apple devices can already act as iBeacons themselves - transmitting information just like a mobile beacon. Few apps have built on this yet, but it means that app users can be constantly announcing their presence and (masked) identity.

Mingleton uses this feature for location-based personal connections, and many such applications for conferencing and events can be imagined.

A New Network

Venture capitalist Max Niederhofer further commented on his company's investment:

...I see this as an infrastructure build-out play, where Beacons are the routers and pipes of a new network infrastructure on which we’ll see some very interesting applications.

...From a VC perspective, this is a bit like Cisco in the late 80s. We’re building the hardware and software that is the backbone of the new network.

This may seem like exaggeration, but given

  • large scale deployment of geo-located iBeacons and others attached to known things,
  • a large base of beacon-enabled apps, and
  • the ability for users of those apps to transmit as well as receive beacon data,

it's certainly tempting to consider a new type of global network, working in conjunction with those that exist today.

This could be viewed as a geographical layer for the web, enhancing context with location; and extends the Internet of Things beyond powered devices requiring wired or wifi internet connections.

Beacon manufacturer Radius Networks has an app that allows users to submit beacon locations, and over 150,000 such locations have been mapped, giving an indication of geographical distribution of beacon deployments today.

Beacon Services

With the deployment of a new type of infrastructure, there will be a rise in companies providing complementary services. For example:

  • Beacon battery and outage monitoring, along with replacement.
  • Internal mapping and beacon locating (using devices like Google's Project Tango tablet).
  • Optimisation of beacon networks.
  • Cataloging of located beacon identifiers (for use by competitors, for example).

Real-World Analytics

It's already possible to track visitors to beacon-enabled places to provide data on routes taken and so on. This could let retailers optimise layouts for sales, galleries identify popular as well as less-visited exhibits and malls identify shopping patterns.

This is analogous to the situation when website owners had only their web logs for analysis. It might have seemed crazy then to suggest that a company could secure a presence on the majority of websites, such that they could track visitors between them - but that's exactly what Google did by offering analytics (in combination with other products). An even crazier suggestion might have been offering your potential competitors space on your website, but that's now common via online advertising (in simplistic terms).

Another analogy from the web would be the move from company- or niche-specific forums and communities to mass adoption of a few social media sites as the places to share information and discuss common interests, and even for companies to manage their customer relations.

Similar developments are likely in geographical analytics - if a compelling enough service is offered, companies may give up data on their own beacon networks and physical customer movements such that they can gain wider intelligence than they would otherwise be able to. This would come from the aggregation of data about many beacon installations.

Taking this to an unlikely conclusion - imagine a retailer being able to target promotions at competitors' customers whilst at known locations in their stores. Or a brand being able to target customers whilst viewing their products at a range of retailers.

It could be that sectors other than retail (where commercial sensitivity is less of an issue), or collection of only anonymised data will provide greater opportunities for such aggregated applications.

Beyond Single Apps to Meta-Apps

Currently, specific companies or organisations create their own beacon installations and integrate beacon-enabled features into their own mobile apps. However, this approach cannot scale to take full advantage of the potential of the new network described above - smaller stores, venues and other locations will not have their own mobile apps, and even if they did, consumers would be deluged with niche location-aware apps.

As touched on above, the concept of aggregation is so powerful that what we will term meta-apps are going to be in the strongest position to offer the most convenience and utility for users, and the most valuable data for businesses. The term arises from comparison with (for example) meta search engines, such as comparison websites for financial products, travel and so on. These take data from many individual websites to provide an aggregated service - offering consumers the best deals, and businesses new sales channels and data.

This can already be seen, in a limited way, amongst the many shopping malls that have implemented iBeacons. They can potentially provide better services to visitors across all of their tenant retailers in a single app. There have also been a number of local applications aggregating the retailers in a particular geographic area (the Brixton Pound app, for example).


We've explained that the highly competitive nature of retail doesn't necessarily prevent meta-apps emerging and there are already signs of this trend developing.

InMarket’s Mobile to Mortar product claims that it can reach 40M shoppers via Earth's Largest iBeacon Network across locations in Los Angeles, San Francisco and Seattle. InMarket doesn't have its own consumer app, but has arrangements with other app owners to embed the technology and thereby provide a conduit to consumers (such as CheckPoints and Epicurious).

Other companies pursuing similar tactics include Shopkick, Vente Privee in France, and Appflare in the UK (targeting convenience stores).

Other meta-type companies in retail, such as Quidco, would be well placed to use iBeacon networks (they already offer some location-based features via GPS).

Other Industries

It's easy to envisage similar meta-style apps for other industries. For example:

  • Services for travellers taking advantage of iBeacon installations across airports and other transit locations.
  • With networks of beacons in restaurants, bars and entertainment venues, new types of app for socialising are enabled, providing joined-up data and services for venues and patrons alike.
  • Beacon-enabled sports venues could facilitate new meta apps for sports fans, who travel to different team venues.
  • An app for iBeacon-enabled museums, galleries and other cultural venues could provide personalised tours, and use your visiting and viewing habits to suggest new exhibitions and events, as well as providing footfall data.

New Types of Application

The applications described above are of a familiar type, taken to their logical meta conclusions. However, completely new types of application could also be possible.

For example, in the same way that Google's mapping efforts enabled revolutionary applications, aggregated information about iBeacon networks on a large scale could facilitate assistants for the visually impaired, providing guidance at any iBeacon-enabled location and internal navigation - providing route finding in unfamiliar (iBeacon-enabled) buildings.


We've tried to give a sense of where iBeacon is heading in the near future, and some thoughts on how the technology and its applications might develop in the longer term.

The pace of deployment, and perceived value to consumers of the apps that emerge, will be crucial over the next year or so in securing widespread adoption for iBeacon applications. Businesses must prioritise service and utility for users in the short term, if they are to realise the potential gains of a new location-aware channel and the data it generates going forward.

If you'd like to discuss an iBeacon application, get in touch!

iBeacon Applications

This is the third in a series of articles on iBeacons (or Bluetooth Low Energy Beacons), describing the current application landscape, by industry sector.

If you need to get up to speed first, check out our introduction to iBeacons, or practical guide.

Real World Applications

Though iBeacon is still a relatively new technology, there have been a number of high profile deployments. We've selected some of these to give a flavour of the different types of application being implemented with iBeacons today.

Many of these share common features and we've tried to highlight unusual or interesting ones. It's clear that many organisations are in a trial phase to establish which features are going to prove compelling and engage customers, whilst not proving too intrusive.

The end goal might be that consumers of all kinds embrace location-aware apps and messaging, so that companies can reap the benefits of the data generated as a side effect. However, right now the focus seems to be on providing or enhancing service and thereby promoting adoption.

It's important to note that these examples all follow a pattern of being specific to a company or organisation, with the requirement that users have installed that organisation's mobile app.


Retail

Retail is currently being seen as the major use case for beacons, and it's easy to see why. In practical terms, navigation can be problematic in large stores and shopping malls - providing location-aware assistance is an obvious application.

However, paired with loyalty card data, stores would be able to push offers, suggestions and reminders targeted not only by location, but also by purchase history and buying patterns.

Taking this full circle, if retail venues can successfully promote adoption of such location-aware apps, they stand to gain huge volumes of new data on consumers' in-store habits that can be used to optimise layouts and future promotions.


Apple Stores

Apple was a pioneer in this area in 2013, deploying iBeacons in 254 US stores. These work with the Apple Store app, and the focus is on providing customer service - for example, additional product info and notifications about Genius Bar appointments or repairs ready for pickup. Apple have said that they do not collect data about customers' movements.

They are currently updating the beacons in their stores - a move that is likely tied to a mobile payments initiative and the release of the iPhone 6.

More Retail

Food and Drink

iBeacons could guide patrons to their seat, flag that they are seated, offer mobile ordering and payment, and work in conjunction with existing online management systems to streamline processes and enhance data on customer activities.


Arts and Culture


And More


Clearly there are already many applications being built using iBeacons in a range of industries and contexts. However this is only the beginning, and it should also be obvious that there are countless opportunities in diverse areas like education, conferences, accessibility, logistics, people management, healthcare, gaming and public services.

There is evidence that Apple is continuing to support and further the technology, and we'll be looking out for hints amongst the announcements at Apple's iPhone 6 event tomorrow!

In our next post, we're going to take a look at the future of iBeacon...

(i)Beacons in Practice

This is the second in a series of articles on iBeacon (or Bluetooth Low Energy Beacons) - explaining how they work in practice and discussing current and future iOS and Android support.

If you need to get up to speed first, check out our introduction to iBeacons.


In the first article, we explained how bluetooth beacons (or iBeacons) constantly transmit unique identifiers, and how those identifiers can be used by mobile apps to retrieve content relevant to the location of the beacon.

How iBeacons Are Put to Work

iBeacon hardware manufacturers generally add value by providing software and services to help customers build out a complete system. For example:

  • Content Management System: Used to assign specific content (text, images, URLs, video etc) to certain beacons.
  • SDK: Vendor-supplied software embedded in mobile apps to handle recognition of beacons and automatic retrieval of their associated content.
  • API: Some vendors provide open interfaces to their systems to allow you to create more customised solutions. For example, you could create a web-based app that works in conjunction with your mobile app to show details of beacons you've visited.
  • Demo and/or management apps: Vendors may provide mobile applications to easily demonstrate the above - retrieving and displaying a beacon's content when it is in range for example. They may also provide apps that let you adjust your beacons' settings.

These components work in conjunction with your own apps and back-end systems (CRM, for example) to provide complete contextual solutions.
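
The core of the Content Management System pattern above can be sketched as a lookup from beacon identity to content. Everything here (key format, data, function names) is illustrative, not any vendor's actual API:

```javascript
// Hypothetical sketch of a beacon CMS lookup: content is keyed by beacon
// identity (UUID/major/minor), and the app resolves whichever beacon it
// detects into that beacon's assigned content.
const contentByBeacon = new Map([
  ['demo-uuid/1/101', { title: 'Welcome', url: 'https://example.com/welcome' }],
  ['demo-uuid/1/102', { title: 'Offers', url: 'https://example.com/offers' }],
]);

function contentFor(beacon) {
  // beacon: { uuid, major, minor } as reported by the SDK or OS
  const key = `${beacon.uuid}/${beacon.major}/${beacon.minor}`;
  return contentByBeacon.get(key) || null;
}
```

In a real deployment the map would live server-side behind the vendor's CMS and API, with the SDK performing this resolution when a beacon comes into range.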

Doing Your Own Thing

As explained in the previous article, beacons implement parts of the Bluetooth standard that are supported by the underlying operating system (Android or iOS, for example). This means that you don't need to rely on vendor-supplied software to create apps that use beacons, and also that recognition doesn't rely on matching beacons with their manufacturer's own software.

This means you have flexibility in creating your own solutions. For example, use Phonegap with a beacon plugin to create a cross-platform mobile app that accesses your beacon vendor's API to retrieve content. Or leave your vendor's software behind altogether and do whatever you like when your beacons are recognised - for example accessing a product catalog or location map.

Importantly, although not originally intended, this means that it's possible for anyone to write an app that recognises any beacon - beacon scanners can be downloaded from mobile app stores.

Platform Support

Considering how current apps display notifications, it's obvious that you don't need to actively open them in order to receive updates - these can be shown on every new event (like email), or more occasional and perhaps with user control of frequency (like Facebook and Twitter).

Beacon apps must work in the same way, and could take things even further. However there are a few key issues that need solutions to provide a frictionless experience.

  • Bluetooth must be turned on.
  • Apps should be able to notify about beacons when running in the background.
  • What if the app is not running at all, or not even installed? Can the OS prompt about beacons?


Android

Beacon apps can run on Android devices that support Bluetooth 4.0 and run at least Android 4.4 (KitKat). Currently, 18% of Android devices run KitKat - these are likely to be newer devices that also support Bluetooth 4.0, though some may be older devices that have been upgraded to KitKat. Whether Bluetooth is on by default will depend on your device and its flavour of Android.

Apps can recognise beacons when running in the background, however this could be dependent on your vendor SDK's support if you are using it.

The forthcoming Android version (codenamed L) offers improved support including better battery life, ability for devices to act as beacons, recognition whilst in standby mode and more. This bodes well for the future, but Apple have so far led the way in platform support.


iOS

Apple invented the term iBeacon and have managed to promote the technology whilst (so far) maintaining standards across other platforms. The iPhone 4s and later (running iOS 7) are capable of running beacon apps, and this accounts for 90% of iOS users. iOS devices are also capable of acting as transmitters, which may create interesting new classes of apps that build on iBeacon standards, without necessarily using iBeacon hardware.

The release of iOS 7.1 also brought improvements and hints for the future. Not only can apps recognise beacons when in the background; iOS will also recognise that an app's beacons are in range when the app is not running at all, and notify the user via the lock screen.

Looking forward, iOS 8 may take this further - using unique beacon IDs to direct users to download the appropriate app, even if they have never had it installed. Full details aren't known but it's clear this would mean submitting data about your beacons to the App Store along with your iOS app.

It's also worth noting that Bluetooth is now on by default on iOS devices - in fact upgrading to 7.1.2 re-enabled Bluetooth if the user had turned it off.

Finally, there is reason to believe that Apple is planning its own iBeacon hardware.

Location in Practice

Where does location come into it? Beacons are widely regarded as devices that provide micro-location - more accurate indoor positioning than GPS or wifi. However, this can be misleading: beacons have no intrinsic knowledge of their location, so this information must be added by you. When beacons are placed inside a building, you record the exact location, and this becomes part of the contextual data retrieved when the beacon is in range (though it could be hard-coded in your app if it's unlikely to change).

Given the precise locations of three nearby beacons, and an estimate of the distance to each (derived from received signal strength), it is possible to calculate a position using trilateration.
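
A minimal sketch of the geometry, assuming flat 2D coordinates: subtracting the three circle equations pairwise gives two linear equations in the unknown position, which can be solved directly. Real RSSI-derived distances are noisy, so readings would need smoothing first.

```javascript
// Estimate a 2D position from three beacons at known coordinates and
// the estimated distance to each. Each argument: { x, y, d } - beacon
// position and distance. Subtracting the circle equations pairwise
// gives A*x + B*y = C and D*x + E*y = F, solved by Cramer's rule.
function trilaterate(b1, b2, b3) {
  const A = 2 * (b2.x - b1.x);
  const B = 2 * (b2.y - b1.y);
  const C = b1.d ** 2 - b2.d ** 2 - b1.x ** 2 + b2.x ** 2 - b1.y ** 2 + b2.y ** 2;
  const D = 2 * (b3.x - b2.x);
  const E = 2 * (b3.y - b2.y);
  const F = b2.d ** 2 - b3.d ** 2 - b2.x ** 2 + b3.x ** 2 - b2.y ** 2 + b3.y ** 2;
  const det = A * E - B * D;
  if (Math.abs(det) < 1e-9) return null; // beacons are collinear: no unique fix
  return { x: (C * E - B * F) / det, y: (A * F - C * D) / det };
}
```

Note the degenerate case: if the three beacons lie on a line, no unique position exists, which is why beacon placement matters for micro-location.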

Context is King?

Beacons therefore add a layer of context. Previously, context was given by a user's details and history, combined with factors like time and possibly approximate location from GPS (think of Google maps and other local services). Now that can be enhanced by precise, immediate location and therefore movement, for example - entering the building, at a specific car in the showroom, at a particular gate in the airport, in a certain section of stadium seating, at some museum exhibit, walking towards a certain store and so on.

This creates opportunities for new types of app providing information, utility and convenience for users that wasn't previously possible. However it also generates significant quantities of data on movements and habits of consumers - this is a potential boon for companies, but is also likely to lead to privacy concerns, even though some companies already track location by GPS and wifi with consumer opt-in.

This means that the applications that emerge in the near future, and how they handle the balance of providing consumer utility with data collection, are going to be of crucial importance in promoting adoption of the technology. Supermarket (and other) loyalty cards show that it's possible to get this right - they are widely embraced, probably because the benefits to the consumer are clear and the proposition is simple.

In the next article we'll provide a comprehensive overview of the current landscape of beacon applications and find out if they're winning over consumers.

Mobile 3D Remote Control

Following our articles on mobile device orientation and using it to mimic the dynamic perspective feature on the Amazon Fire Phone, we're combining device orientation with remote control. And there's a demo with a Bugatti Veyron in it!

Device Orientation Recap

In previous articles, we explained how mobile devices track their orientation in three dimensions, and how this is available to use in web apps via events that are triggered when the device tilts and moves.

We also showed how this could be used to look around three dimensional objects on screen, just like the dynamic perspective feature of Amazon's Fire Phone.

In this article we're going to look at sending this information from the mobile browser to be used for remote control.

Remote Control with Node.js and Websockets

What do we mean by remote control? Imagine a full screen web browser running on your TV, and replace your TV remote with your mobile phone. This is the second screen experience of apps like TVTag (formerly GetGlue) and Beamly (formerly ZeeBox) with an added layer of direct interactivity and control.

There are a number of ways to achieve this - Samsung, for example, are working on consumer devices that communicate directly on local networks. However, we're aiming for something that you can interact with using your internet-connected device and its web browser, with no special apps or setup required.

This is where Node.js and websockets come in. Put simply, a websocket is a bi-directional channel between a browser page and a remote server. It's lightweight and low latency: after an initial handshake, you simply send messages over it without per-message HTTP requests or responses. Websockets haven't been widely supported by browsers until relatively recently.

Node.js has excellent websocket support via third-party libraries, and handles the server side of the connection. In our case, this just involves managing the pairing and interactions of a controlling browser with a viewer (that will be controlled).

An Application

It occurred to us that manipulation of 3D models using two dimensional controllers like touch screens and keyboards was tricky and unnatural, and perhaps - by virtue of its orientation in three dimensions - a mobile device could act as a better, three dimensional controller for a 3D model on screen.

We'll talk more about potential applications in a future post, but with increasingly sophisticated technology making its way into advertising and in-store displays, one can imagine interacting with a digital ad to explore a product and its customisation options. This is much more powerful than (say) a video loop, and has the side effect of generating potentially valuable marketing data and interactions (more on that later).

How it Works

There is a passive viewer page, accessed via a generic URL. On the server side, each new request for the viewer page generates a unique ID, and the viewer page displays a unique controller URL incorporating the ID. A QR code is generated for the URL for convenience, since it's expected to be accessed on a mobile device.

When a request is received for the controller URL, the server establishes a connection with the associated viewer page. Thereafter, control messages are simply relayed from the controller to the viewer, with the interpretation of those messages handled in JavaScript by the viewer.
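
The pairing-and-relay logic can be sketched with the transport abstracted away. In the real server the handlers would be websocket connections rather than direct function calls, and all names here are illustrative:

```javascript
// Minimal sketch of viewer/controller pairing: each viewer registers
// under a unique ID, a controller joins with that ID, and messages are
// relayed through to the paired viewer untouched.
const viewers = new Map(); // pairing ID -> viewer message handler

function registerViewer(onMessage) {
  // The unique ID generated for each viewer page request; it is embedded
  // in the controller URL (and QR code) shown on the viewer screen.
  const id = Math.random().toString(36).slice(2, 10);
  viewers.set(id, onMessage);
  return id;
}

function relayFromController(id, message) {
  // Control messages are relayed verbatim; interpretation is left to
  // the viewer page's own JavaScript.
  const viewer = viewers.get(id);
  if (!viewer) return false; // unknown or expired pairing
  viewer(message);
  return true;
}
```

Keeping the server this dumb is deliberate: the viewer decides what each message means, so new interactions need no server changes.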

The Prototype

We decided on a car showroom prototype, since cars generally have a wide range of options for customisation, and these will mostly not be available for physical viewing in a showroom.

We chose three.js to handle the display and manipulation of a 3D model (note this requires WebGL), and tracked down a Bugatti Veyron model to work with. This lets us demonstrate the following.

  • Manipulation of the model in three dimensions using device orientation.
  • Zooming the model view with pinch gestures.
  • Customisation of the model (in this case, choosing a colour).
  • Generating other interactions (such as social sharing, enquiries and so on).

A fallback touch interface was implemented in case device orientation events are not available.


Check out the viewer interface and start exploring the prototype.


As mentioned in our previous post, device orientation events are not always exposed reliably by the browser, and the device sensors can behave erratically at certain boundaries. For this reason, left-to-right and forward-backward rotation in the prototype is limited to just inside -90° and 90°. If the model seems to behave oddly, use our utility to check what values your device/browser is reporting.

The rotation of the device means the web browser will tend to flip between portrait and landscape mode. This necessitated a responsive design for the controller.

Taking it Further

Clearly, both controller and viewer could be more sophisticated, but the real opportunity may lie in the marketing data generated as a side effect of interactions.

For example...

  • Allowing users to save and share particular configurations, building a database of popular options and combinations.
  • Connecting with social accounts on-device to add demographic information.
  • Following up with personalised communication based on users' activities.
  • Measuring engagement compared to non-interactive content.

So we can see that, as well as providing a novel means of control and interaction, applications of this type give companies and marketers a brief presence on a consumer's device that can be built upon to create a longer-term conversation. After all, the controller is likely to be the page the user sees when they next open their browser - its content and message could be changed after an extended period of inactivity.

We'll explore other applications in the next post, in the meantime - let us know what you think!

Dynamic Perspective

Amazon launched its first foray into the Android handset market, the Fire Phone, in June 2014. It incorporates unique head tracking that lets users look around 3D objects on screen. But how close can we get with regular device orientation?

Amazon's Fire Phone

The Fire Phone uses four additional cameras to achieve its innovative head-tracking feature; however, we noticed that in all the demos we saw, users moved the device itself, not their head.

Whilst this feature could be a precursor of something more useful on larger-format devices (that you might walk around), it got us thinking about how well we could mimic it for mobile in HTML5 using the deviceorientation event.

Device Orientation

Modern smartphones incorporate a gyroscope, accelerometer and compass that report the device's orientation, movement and direction. You may have seen games controlled by tilting your phone, but these features are not widely used outside of navigation and rotating the display when the phone is turned on its side.

This is surprising given that device orientation and direction are available to HTML5 web apps in the browser.

Technical details are explored in another post; here we will focus on a few prototype applications.


The Fire Phone demonstrations showed users looking around 3D objects, and tilting to scroll. The latter has been available on other Android phones for some time, and is arguably tricky to use in practice.

The prototypes make use of the left-right and forward-back tilt of the phone. You can use this handy page to check that your phone reports this information, and that your web browser makes it available. If you don't see values updating on that page as you move your phone, device orientation is not being reported to the browser and the demos won't work. If that's the case, you might want to try a different browser on your phone.

Looking Around 3D Objects

Whilst the Fire Phone detects the viewer's head position to change the rotation of objects on screen, we will achieve a similar result using the orientation of the device. The idea is to rotate the object on screen in the same direction, such that tilting your phone lets you see more of the object in a natural way.
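The effect boils down to building a CSS3 transform from the tilt angles. Here's a hedged sketch - the damping factor and function name are our own, not the demo's actual code:

```javascript
// Sketch: build a CSS3 3D transform string from device tilt so the
// on-screen object rotates with the device (the "look around" effect).
// Hypothetical helper; the scale factor damps the effect so a small
// tilt doesn't spin the object right round.
function dynamicPerspectiveTransform(beta, gamma, scale) {
  const rx = (beta * scale).toFixed(1);  // forward-back tilt
  const ry = (gamma * scale).toFixed(1); // left-right tilt
  return 'rotateX(' + rx + 'deg) rotateY(' + ry + 'deg)';
}

// Applied on each deviceorientation event, e.g.:
//   box.style.transform = dynamicPerspectiveTransform(e.beta, e.gamma, 0.5);
```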

Check it out in action on this page (a QR code will be shown on non-mobile browsers for convenience).

The box was created and rotated using CSS3 3D transforms.

We also created another demo using these fluffy little clouds, also created with CSS3. You can tilt your device to look around the clouds.

Orientation Correction

When creating the dynamic perspective prototypes, we wondered what would happen if you rotated the object in the opposite direction, so that it appeared to maintain its original position even when you tilted your device.

We've applied this technique to a small example web page for the purposes of illustration. This also shows the power of 3D transforms in CSS3 - rotation is applied to a single containing element and inherited by everything within it, as you'd expect.
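The correction is just the dynamic perspective idea with the signs flipped - rotate the element the opposite way to the device tilt. A minimal sketch (helper name is ours):

```javascript
// Sketch: counter-rotate an element against the device tilt so it
// appears fixed in space even as the device moves. Hypothetical helper.
function correctionTransform(beta, gamma) {
  // Negate both tilt angles to cancel the device's rotation.
  return 'rotateX(' + (-beta) + 'deg) rotateY(' + (-gamma) + 'deg)';
}

// Applied to the containing element on each deviceorientation event, e.g.:
//   container.style.transform = correctionTransform(e.beta, e.gamma);
```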


Device orientation and motion (via the deviceorientation and devicemotion events) are little-used but increasingly relevant in web development for a mobile-first world. There are relatively few real-world applications, but great potential for gestural input that doesn't require your free hand.

We've shown simple examples inspired by the Amazon Fire Phone demos that used multiple cameras to achieve similar results. Our next demo will show a practical application - remote mobile control of a more sophisticated model, rendered with the 3D JavaScript library three.js.

Mobile Device Orientation

The deviceorientation and devicemotion events have been supported by the default iOS and Android browsers for some time, and are now supported by the other major mobile browsers too. This creates opportunities for new types of interaction, and this post acts as an introduction for those getting started with mobile orientation events, including some practical experience.

What's Available

Most mobile devices incorporate a gyroscope, accelerometer and compass that, between them, can report the orientation and movement of the device in three dimensions.

The deviceorientation event returns an object with three properties: alpha, beta and gamma, corresponding to the device's rotation in degrees around the Z, X and Y axes respectively.

alpha is derived from the compass; for example, 90° means the top of the device is pointing west.

A device lying flat should report beta and gamma values of zero.

From this position, tilting the device's screen towards the right increases gamma, reaching 90° when the screen is fully pointing right. Tilting left decreases gamma, reaching -90° when it is fully pointing left.

Similarly, tilting the device forward towards you increases beta, reaching 90° when the screen is vertical and facing you, while tilting it away decreases beta, reaching -90° when it is upside down and facing away.
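The conventions above can be captured in a tiny helper - useful for getting your bearings when debugging. The function name and wording are our own, not part of the API:

```javascript
// Sketch: turn raw beta/gamma readings into a human-readable description
// of the tilt, following the sign conventions described above.
// Hypothetical helper for debugging, not part of the API.
function describeTilt(beta, gamma) {
  return {
    frontBack: beta > 0 ? 'towards you' : beta < 0 ? 'away from you' : 'flat',
    leftRight: gamma > 0 ? 'right' : gamma < 0 ? 'left' : 'flat',
  };
}

// window.addEventListener('deviceorientation', e => {
//   console.log(e.alpha, e.beta, e.gamma, describeTilt(e.beta, e.gamma));
// });
```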

The devicemotion event returns information about the device's acceleration and rotation rate in x, y and z.
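A common use of devicemotion is detecting a shake from the acceleration values. Here's a minimal sketch - the threshold is an assumption for illustration:

```javascript
// Sketch: a simple shake check from devicemotion acceleration values
// (x, y, z in m/s^2, excluding gravity). Threshold is an assumption.
function accelerationMagnitude(acc) {
  return Math.sqrt(acc.x * acc.x + acc.y * acc.y + acc.z * acc.z);
}

function isShake(acc, threshold) {
  return accelerationMagnitude(acc) > threshold;
}

// window.addEventListener('devicemotion', e => {
//   if (e.acceleration && isShake(e.acceleration, 15)) { /* react */ }
// });
```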

Device and Browser support

Gyroscopes, compasses and accelerometers are standard in modern mobile devices and, in principle, the deviceorientation event is well supported by current mobile browsers.

In practice though, we have found wild variation in the values returned to the browser by different device-browser combinations.

What's Really Available

The w3c documentation on these events is fairly prescriptive, but leaves a few loopholes for browser implementations. For example:

The event should fire whenever a significant change in orientation occurs. The definition of a significant change in this context is left to the implementation.

Although a threshold of 1 degree is recommended, tests in Chrome on an HTC One M8 running Android 4.4.2 currently only seem to return alpha and gamma to a granularity of 90 degrees, and return NaN (Not a Number) for beta! Testing in the default Android browser on the same device gave the expected values, however they appear to update only once per second.

Our initial testing was with a range of Samsung Galaxy devices using Chrome - these gave the expected values and updated them very frequently.

The exception to all of this is the alpha value reported by the compass - on most devices we tested it seems to be unreliable and erratic, though we found it reported reliably by a Galaxy S3 mini.
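Given readings like these, it's worth sanity-checking events before trusting them. A minimal guard (our own hypothetical helper, not from any of the prototypes) might be:

```javascript
// Sketch: guard against the broken readings described above (NaN beta,
// missing values) before using deviceorientation data.
function isUsableOrientation(e) {
  return [e.alpha, e.beta, e.gamma].every(
    v => typeof v === 'number' && !Number.isNaN(v)
  );
}

// window.addEventListener('deviceorientation', e => {
//   if (!isUsableOrientation(e)) return; // fall back to touch controls
//   // ...use e.alpha, e.beta, e.gamma...
// });
```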

You can test your device-browser combo using our handy checker (a QR code link is shown for convenience if you view on a non-mobile browser).


You might expect that rotating your device fully around a single axis would produce values increasing smoothly from 0 to 360° then wrapping round. However, this is not the case:

  • beta has a range of -180° to 180°
  • gamma has a range of only -90° to 90°

For example, tilting your device towards upright vertical increases beta, but as beta approaches 90°, gamma can vary wildly across its whole range. This is partly due to ambiguity in the coordinate system, since certain orientations can be achieved by different sets of rotations, but the device and browser can introduce their own issues.
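A crude workaround is simply to stop trusting gamma near the unstable region. This sketch holds the last good value once beta passes a cut-off - the 70° threshold is an assumption for illustration:

```javascript
// Sketch: ignore gamma when the device is near upright (beta close to
// 90 degrees), where gamma can swing wildly across its whole range.
// The 70 degree cut-off is an assumption, not a tested constant.
function stableGamma(beta, gamma, previousGamma) {
  return Math.abs(beta) > 70 ? previousGamma : gamma;
}

// let lastGamma = 0;
// window.addEventListener('deviceorientation', e => {
//   lastGamma = stableGamma(e.beta, e.gamma, lastGamma);
// });
```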

There is a bug logged for Chromium that highlights all of this quite well.

This is related to the phenomenon of Gimbal lock and the w3c documentation explains mapping the values to avoid such issues (for example using Quaternions).
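The conversion the w3c documentation suggests can be sketched as follows: treat alpha, beta, gamma as intrinsic Z-X'-Y'' rotations and combine them into a quaternion [w, x, y, z], which avoids the gimbal ambiguity. This follows the standard formula along the lines of the spec's worked example:

```javascript
// Sketch: convert deviceorientation angles (intrinsic Z-X'-Y'' rotations,
// in degrees) to a unit quaternion [w, x, y, z], which has no gimbal lock.
const degtorad = Math.PI / 180;

function orientationToQuaternion(alpha, beta, gamma) {
  const _x = (beta * degtorad) / 2;  // half-angle about X
  const _y = (gamma * degtorad) / 2; // half-angle about Y
  const _z = (alpha * degtorad) / 2; // half-angle about Z
  const cX = Math.cos(_x), cY = Math.cos(_y), cZ = Math.cos(_z);
  const sX = Math.sin(_x), sY = Math.sin(_y), sZ = Math.sin(_z);
  return [
    cX * cY * cZ - sX * sY * sZ, // w
    sX * cY * cZ - cX * sY * sZ, // x
    cX * sY * cZ + sX * cY * sZ, // y
    cX * cY * sZ + sX * sY * cZ, // z
  ];
}
```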


Device orientation can supplement other means of input in an increasingly mobile-first world, and is not just a gimmick - consider that it offers various degrees of control without the use of your free hand or other external input.

However, variability in implementation on specific devices and browsers means that it can't be relied on as the sole means of controlling an interaction. For example, we're working on a prototype for mobile remote control that displays arrow buttons for control when device orientation is not being reported.

A post on that application is imminent; for now, you can also read about mimicking the Amazon Fire Phone's dynamic perspective feature using device orientation.