Week 2 Creative Post

User Interfaces of AR Experiences

Inspired by a step-by-step guide to designing user interfaces for Virtual Reality (VR), I wanted to explore a similar design process for Augmented Reality (AR). The original article explored whether we can standardize a set process for designing VR UI the same way we design for fixed screen sizes. However, due to the breadth of mediums AR covers, that approach isn't quite applicable. Instead, I researched existing AR UIs and how people envision AR could be used, and categorized their designs.

Screen design happens mostly on flat surfaces of pixels: the view area is usually the entire screen, and screens come in a fixed range of sizes. VR mediums are also relatively uniform, mostly standalone headsets or mobile devices placed in headsets. Most present UI in 3D space, anchored toward the user and moving with them. The UI is usually used to control the experience rather than being part of it, so it often sits right in front of the user, presented as 2D panels with 3D perspective.

Typical VR UI
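
To make that pattern concrete, here is a minimal sketch of a head-anchored panel. I'm using RealityKit as a stand-in (the function name, panel size, and offset are purely illustrative), but the idea carries over to VR headsets: the panel is parented to the camera, so it stays right in front of the user and follows them around.

```swift
import RealityKit
import UIKit

// A minimal sketch (RealityKit as a stand-in; names are illustrative)
// of the typical VR pattern: a flat 2D panel anchored to the user's head,
// so the UI stays in front of them and moves as they do.
func addHeadAnchoredPanel(to arView: ARView) {
    // A thin flat plane acts as the 2D panel rendered with 3D perspective.
    let panel = ModelEntity(mesh: .generatePlane(width: 0.4, height: 0.25),
                            materials: [SimpleMaterial(color: .darkGray, isMetallic: false)])

    // Anchoring to the camera means the panel follows the user's view.
    let headAnchor = AnchorEntity(.camera)
    // Push the panel half a meter in front of the user so it sits in view.
    panel.position = SIMD3<Float>(0, 0, -0.5)
    headAnchor.addChild(panel)
    arView.scene.addAnchor(headAnchor)
}
```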

Augmented Reality, with its broad definition, has many mediums: smartphones, glasses, projections at public installations...

1. On-screen augmentation with UI

First, we have the screen-based ones, where augmentation happens over the live video feed from the camera, typically with a single interactive object. Typical examples include Snapchat and Instagram filters, where users rely on on-screen UIs. This approach applies to most AR experiences that happen through cameras, where interactions and observations are both confined to the screen and controlled through the screen.
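
The structure behind this category is simple: the camera feed fills the screen, and the UI is made of ordinary 2D controls layered on top of it. Here is a rough sketch using RealityKit's ARView and UIKit; the class and button are made up for illustration.

```swift
import UIKit
import RealityKit

// A minimal sketch (names are illustrative) of category 1:
// the camera feed fills the screen, and all UI is ordinary 2D
// controls layered on top of it.
class FilterViewController: UIViewController {
    let arView = ARView(frame: .zero)          // live camera feed + AR content
    let toggleButton = UIButton(type: .system) // plain on-screen UI

    override func viewDidLoad() {
        super.viewDidLoad()

        // The AR view is just another full-screen subview.
        arView.frame = view.bounds
        view.addSubview(arView)

        // On-screen control: positioned in screen space, not in the 3D scene.
        toggleButton.setTitle("Switch Filter", for: .normal)
        toggleButton.frame = CGRect(x: 20, y: view.bounds.height - 80,
                                    width: 160, height: 44)
        toggleButton.addTarget(self, action: #selector(switchFilter),
                               for: .touchUpInside)
        view.addSubview(toggleButton)
    }

    @objc func switchFilter() {
        // A real filter app would swap the active effect here.
    }
}
```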

2. Assistive graphics without control UI

Another application is experiences that capture video and overlay information for assistance. Some need to be spatially aware, while others can be pre-programmed. These are information-heavy and require less user input.

Yelp imagined a future where you can look at business reviews on the street, extending their existing UI into 3D space.
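
In code, this category boils down to pinning read-only information at a position in the world, with no controls for the user to operate. Here is a rough RealityKit sketch; the function name, position, and label text are placeholders of my own.

```swift
import RealityKit
import UIKit

// A minimal sketch (assuming RealityKit and a placeholder world position)
// of category 2: a purely informational label pinned in space,
// with no controls for the user to operate.
func addReviewLabel(to arView: ARView, at position: SIMD3<Float>, text: String) {
    // Build a small 3D text mesh for the overlay, e.g. "Cafe, 4.5 stars".
    let mesh = MeshResource.generateText(text,
                                         extrusionDepth: 0.002,
                                         font: .systemFont(ofSize: 0.08))
    let label = ModelEntity(mesh: mesh,
                            materials: [SimpleMaterial(color: .white, isMetallic: false)])

    // Anchor the label at a fixed world position; the user only looks at it.
    let anchor = AnchorEntity(world: position)
    anchor.addChild(label)
    arView.scene.addAnchor(anchor)
}
```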

3. Interact with virtual content on screen

The third application is experiences that use real-world space as a backdrop for specific content. IKEA lets you virtually try out furniture in your own space. Users interact with on-screen UI as well as directly with the content.

Pokémon Go is another example where users fiddle with on-screen elements. UI elements are positioned at fixed places on the screen, agnostic of the augmented content.
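
This category is where the screen and the 3D scene meet: a tap in screen space gets raycast into the room, and content is anchored wherever it lands. Below is a rough sketch of the tap-to-place pattern with ARKit and RealityKit; the placeholder box stands in for a real furniture model.

```swift
import RealityKit
import ARKit
import UIKit

// A minimal sketch (class name and placeholder model are assumptions) of
// category 3: the user taps the screen, and virtual content is placed
// directly into the real-world space seen through the camera.
class PlacementViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Run plane detection so we have horizontal surfaces to place onto.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        arView.session.run(config)

        // Screen-space interaction: a plain tap gesture on the 2D view.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: arView)

        // Cast a ray from the tapped 2D point into the 3D scene.
        guard let result = arView.raycast(from: point,
                                          allowing: .estimatedPlane,
                                          alignment: .horizontal).first else { return }

        // Place a simple placeholder box where the ray hit the floor
        // (a furniture app would load a real model instead).
        let box = ModelEntity(mesh: .generateBox(size: 0.2),
                              materials: [SimpleMaterial(color: .gray, isMetallic: false)])
        let anchor = AnchorEntity(raycastResult: result)
        anchor.addChild(box)
        arView.scene.addAnchor(anchor)
    }
}
```

Note that the gesture itself is still a plain 2D touch; only the result lives in 3D, which is why these UIs can stay agnostic of the augmented content.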

4. Real reality

All of the above are video-feed based: reality is augmented through visual layers added to a video. As you have seen, the UI is mostly on-screen and touch-based. Another way to approach AR is through experiences where you see reality with your own eyes, whether through clear glasses or not. The most well-known example would be Google Glass. Due to the lack of a physical surface to place UI on, voice and gesture commands are usually used instead.

Of all the UI approaches listed above, No. 1, with its on-screen UI, is the most fleshed out, interacted with by millions daily. No. 3 is gaining momentum but requires slightly more development effort and has less compelling use cases so far. However, with Apple's support through ARKit and its human interface best practices, hopefully we will see more of that kind of UI done right.