When you and your team are developing your first VR application, it’s hard not to draw attention to yourselves at work. Your colleagues are sitting at their desks, and you’re spending the day spinning around, punching the air, crouching under tables and attracting looks of curiosity, envy or outright confusion. There’s an undeniable intrigue to your work and you’ll often be interrupted by people asking for a go.

Virtual reality is in the news everywhere this year and there are a lot of talented people in the software, gaming and film industries finding out what it can do. Studios are investing heavily in the new wave of VR products, and major players like Sony and Facebook are offering us systems we can buy and use at home.

Whilst that’s really exciting, the majority of applications we hear about are focused on entertainment like games and film. Our team is more interested in how VR might be used in the workplace, and we’ve been experimenting with ways it could help engineering teams collaborate remotely on specialist projects. Whilst we don’t see VR completely replacing our keyboards and monitors, we do see a near future where we use it alongside them at work for certain tasks. We started a project to prove we could build an intuitive, practical and engaging work experience for engineering colleagues at DNV GL.

HTC Vive: Headset, controllers and room sensors (Image source: http://bit.ly/2dwHEnc).

We’ve been developing with the 3D game engine Unity for the HTC Vive, which is, at present, one of the more sophisticated hardware choices. VR hardware ranges from phone-based systems like Samsung Gear VR and Google Cardboard, to seated experiences like Oculus Rift, to the Vive, which enables you to roam around a small space using tracking sensors. This allows us to walk, crouch and jump around the room as we would in real life, and notifies us gently when we get close to a wall. As our application would involve a great deal of movement around a large environment, we felt the Vive would best suit our needs.

Whilst we began with user research and story mapping, I tried to swot up on UX for VR, but found surprisingly little online by way of practical advice or best practice. What’s the best way to navigate a space, for example, or build a settings menu? Setting out to discover for ourselves, our team has learned a great deal over the past few months through repeated experimentation, prototyping, testing and more testing. If only I’d known then what I know now…


1. Much of the established wisdom goes out the window

The most exciting thing for a UX designer on their first VR project is the jump from a 2D screen to an immersive, 3D environment. Input devices like touchscreens and mice are gone, replaced by a new wave of controllers that fundamentally change our interactions with the environment and require us to overturn many longstanding ideas about good UX design which don’t apply once the screen is gone. The medium being so new, we’ve yet to establish new design standards, and are seeing all manner of unusual and innovative ideas emerge. It’s a great time to be working in VR, but be prepared to throw away a lot of what you know.

In addition, the VR design and development process is also new territory. Our team has plenty of experience with Agile and Lean UX methods, but some of the tools and techniques we’d grown used to needed to change here, too. More on that later.

2. Motion sickness happens for some users, but you can design it out.

You may have heard that people get motion sickness using VR, and it’s true, but in our experience it’s caused more by bad design decisions or poor performance than anything else. This is probably the first time as a designer you’re able to make people throw up or fall over using your application – assuming you’re not a practical joker, you’ll want to make them as comfortable as possible.

During our early sprints, we found dizziness to be an occasional problem for some users, though not all, with the most experienced users proving the hardiest. Valve suggests keeping frame rates above 90fps, and sure enough, when graphics got demanding we noticed a disturbing visual juddering effect. Whilst optimising design for performance is standard for any project, in VR the consequences of low frame rates are more severe as they can cause physical discomfort.
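
To catch this during development, a lightweight frame-time monitor can flag scenes that dip below the 90fps target while people are testing. Here’s a minimal sketch in plain Unity C# – the smoothing factor and warning threshold are our own illustrative choices, not values from any SDK:

```csharp
using UnityEngine;

// Sketch: a simple frame-rate monitor for VR testing sessions.
// Logs a warning whenever the smoothed frame rate drops below the
// 90fps target, which is where we started to notice judder.
public class FrameRateMonitor : MonoBehaviour
{
    const float TargetFps = 90f;   // Vive refresh rate
    float smoothedDelta;

    void Update()
    {
        // Smooth the frame time a little so one-off spikes don't spam the log.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
        float fps = 1f / Mathf.Max(smoothedDelta, 0.0001f);

        if (fps < TargetFps)
        {
            Debug.LogWarning("Frame rate dipped to " + fps.ToString("F0") + "fps - judder risk");
        }
    }
}
```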

Our teleport feature for movement around large spaces.

Whilst the Vive allows natural, short-range movement, going beyond a few metres in the VR environment means designing a travel system for longer distances – a teleport. Inspired by Valve’s The Lab, we built a means of projecting an arc of light from your controller to point at where you want to go, and tapping a controller button to travel. The arc is a visual indicator of what will happen next, and is needed to accurately ‘target’ your destination.
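
If you’re curious how the targeting works under the hood, the landing point can be found by sampling a simple projectile path from the controller and checking each segment against the scene geometry. This is only a rough sketch in plain Unity C# – field names like launchSpeed are illustrative, and a production version would also validate that the hit surface is walkable:

```csharp
using UnityEngine;

// Sketch: sample a parabolic arc from the controller and find where it lands.
// The same sampled points can be used to draw the visible arc of light.
public class TeleportArc : MonoBehaviour
{
    public Transform controller;     // tracked controller transform
    public float launchSpeed = 8f;   // how far the arc reaches
    public int segments = 30;        // arc resolution

    public bool TryGetTarget(out Vector3 target)
    {
        Vector3 position = controller.position;
        Vector3 velocity = controller.forward * launchSpeed;
        const float step = 0.05f;    // seconds of simulated flight per segment

        for (int i = 0; i < segments; i++)
        {
            Vector3 next = position + velocity * step;
            velocity += Physics.gravity * step;   // gravity bends the arc downwards

            // Stop the arc at the first surface it meets.
            RaycastHit hit;
            if (Physics.Linecast(position, next, out hit))
            {
                target = hit.point;
                return true;
            }
            position = next;
        }

        target = Vector3.zero;
        return false;
    }
}
```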

When moving, we initially had the system ‘shove’ users across the room to their destination. The effect was much like jumping on/off an airport travelator, and several people – including me – stumbled and felt momentary dizziness. Next we tried adding acceleration and deceleration and slowing travel speed, which was fine for experienced users, but with our target audience of new adopters we needed to minimise any chance of discomfort. We learned that involuntary motion – translating or rotating the camera or environment – could induce discomfort, and we therefore settled on a ‘blink’ animation which teleports users to their destination with a quick fade in/out to black instead. This decision involved a trade-off between seamless motion, which keeps you oriented in space but can be uncomfortable, and the ‘blink’, which is effortless but requires users to re-orient themselves after each jump.
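
A minimal sketch of the blink itself, in plain Unity C#, looks something like the following – it assumes a full-screen black Image on a camera-facing canvas and a ‘play area’ root object for the tracked space, both hypothetical names rather than anything from the Vive SDK:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Sketch: a 'blink' teleport. Fade to black, move the play area while the
// user can't see, then fade back in - avoiding the involuntary motion that
// caused discomfort in our testing.
public class BlinkTeleport : MonoBehaviour
{
    public Transform playArea;    // root of the tracked space (hypothetical rig object)
    public Image fadeOverlay;     // full-screen black Image on a camera-facing canvas
    public float fadeTime = 0.15f;

    public void TeleportTo(Vector3 destination)
    {
        StartCoroutine(Blink(destination));
    }

    IEnumerator Blink(Vector3 destination)
    {
        yield return Fade(0f, 1f);          // fade out to black
        playArea.position = destination;    // reposition during the blackout
        yield return Fade(1f, 0f);          // fade back in
    }

    IEnumerator Fade(float from, float to)
    {
        Color c = fadeOverlay.color;
        for (float t = 0f; t < fadeTime; t += Time.deltaTime)
        {
            c.a = Mathf.Lerp(from, to, t / fadeTime);
            fadeOverlay.color = c;
            yield return null;
        }
        c.a = to;                           // land exactly on the target alpha
        fadeOverlay.color = c;
    }
}
```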

Music generation UI in Soundstage, composed of connected 3D objects.

3. Intuitive VR controls are physical and mimic the real world

Many of the standard UI controls we’re used to seeing on 2D screens suddenly seem strange in VR. When the user is immersed in what feels like a real, physical environment, the usual flat buttons and panels feel out of place, even jarring when overlaid on to the world.

In VR, natural-feeling controls are physical again – buttons depress and audibly click, switches flip and grouped objects really are plugged together. In UX we refer to ‘affordance’ – meaning an object’s characteristics intuitively imply its functionality. Sometimes this means a UI control is ‘skeuomorphic’, meaning it’s riffing on a real-world object, but in recent years the trend has been for visually simpler, abstract controls. Ironically, we found VR controls most successful when they’re skeuomorphic again – we understand these physical controls because we’ve used similar ones in reality, and their presence makes sense in the 3D environment.
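
As an illustration of what ‘physical’ means in practice, here’s a rough Unity C# sketch of a pressable button whose cap visibly depresses and clicks when a controller touches it. The setup – a trigger collider on the button, a cap child object and an AudioSource for the click – is our own illustrative arrangement:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch: a skeuomorphic 3D button. The cap depresses when a controller
// (or any collider) enters the trigger, a click plays, and listeners are notified.
[RequireComponent(typeof(AudioSource))]
public class PhysicalButton : MonoBehaviour
{
    public Transform cap;             // visible button cap (child object)
    public float pressDepth = 0.01f;  // metres the cap travels when pressed
    public UnityEvent onPressed;      // hook up behaviour in the Inspector

    Vector3 restPosition;
    AudioSource click;

    void Start()
    {
        restPosition = cap.localPosition;
        click = GetComponent<AudioSource>();
    }

    void OnTriggerEnter(Collider other)
    {
        cap.localPosition = restPosition - Vector3.up * pressDepth;  // depress the cap
        click.Play();                                                // audible feedback
        onPressed.Invoke();
    }

    void OnTriggerExit(Collider other)
    {
        cap.localPosition = restPosition;   // spring back when released
    }
}
```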

Prototype 3D interface objects for playing back image and audio files.

Our team’s application allows users to project image files on to the walls of their environment. We represent these image files as solid, projector-like devices which can be picked up and used much like a real product. Audio files, similarly, are represented by speaker-like cubes. On/off controls exist as buttons on the devices themselves, and metadata or context menus can be similarly attached.

VR music generation app Soundstage uses this concept very well, representing your musical controllers as mixing desks and drum machines with buttons and dials on the devices themselves.

When our users group several files together, we extend the physical metaphor and put them inside a container together – one that can be exploded on demand to reveal the objects inside – or link them via visible strings.

4. Wayfinding works best as part of the environment

Our application required that we help users navigate around a large environment with little in the way of distinctive landmarks. Our teleporter by now was working well, but people still needed help to know where they were, and this meant labelling rooms and key structural details for orientation. Our first attempts at wayfinding mimicked familiar solutions like satnavs and gaming HUDs – overlaying flat, 2D arrows on to our view to follow to the next waypoint, or drawing a line to follow. However, testing revealed these felt strange on top of an otherwise realistic environment, and in some cases users felt they obscured their view rather than helping.

Signage built into the environment.

Instead, we tried a different approach and adopted a signage system similar to what one might find in airports and hospitals. Using only the environment, we attached clear, high-contrast signs to the walls and floors of our VR world and combined this with a map users can call up at any time. The signs needed to be easily noticeable and placed at comfortable viewing heights so people wouldn’t experience discomfort by regularly craning their necks to look up.

Despite its simplicity, signage turned out to be a more intuitive solution that felt like part of the environment, rather than something overlaid on top, and has kept the experience immersive.

5. Whole body ergonomics are now an important consideration.

Working with a Vive is a much more physical experience than sitting at a desk with a keyboard and mouse, with users standing, moving, gesturing and turning their head to look all around them. This means that for a good experience, the designer must now consider whole body ergonomics.

You’ll need to minimise physical exertion, and that can be done by placing controls in comfortable positions and avoiding unnecessarily tiring gestures and too much head-turning. Consider users’ reach envelope – the area they can reach whilst performing their task – for UI placement, to avoid them having to step forward or lean in uncomfortably, and ensure the most commonly used controls are closest. Users will also tend to infer significance based on distance – objects within the envelope will be assumed to be significant and probably interactive, and those farther away less important.

Consider reach envelope to avoid repeated actions which require stretching or feel uncomfortable (Image source: http://bit.ly/2duOQR2).

Our application allows people to call up free-floating UI panels that relate to a work task, and position them in the room, effectively creating a temporary workspace with reference information at hand. In such a scenario, information should be at a comfortable viewing height that doesn’t require craning the neck, and sized and designed so it can be read from a short distance. Buttons that will be pushed often, for example, shouldn’t be above eye height, or arms will tire quickly.
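
A simple way to get that placement right is to spawn panels at arm’s length, slightly below eye level and facing the user. The sketch below (plain Unity C#, with illustrative distances rather than measured ergonomic values) shows the idea:

```csharp
using UnityEngine;

// Sketch: position a floating work panel at a comfortable height and roughly
// arm's length in front of the user, so they don't need to crane or lean.
public class PanelPlacer : MonoBehaviour
{
    public Transform head;               // the VR camera transform
    public float distance = 0.9f;        // roughly arm's length, in metres
    public float heightOffset = -0.15f;  // slightly below eye level

    public void Place(Transform panel)
    {
        // Flatten the view direction so the panel doesn't end up high or low
        // just because the user happened to be looking up or down.
        Vector3 forward = head.forward;
        forward.y = 0f;
        forward.Normalize();

        panel.position = head.position + forward * distance + Vector3.up * heightOffset;

        // Face the user; flip the direction if your panel's front points the other way.
        panel.rotation = Quaternion.LookRotation(forward);
    }
}
```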

The Vive allows us to wander around relatively unhindered; however, the headset still needs a long cable that unfortunately wraps around users occasionally or becomes a trip hazard. This nuisance is rare but worth noting, particularly if the application encourages users to turn around often.

Those in the UX industry interested in learning more about ergonomics should pick up the reference books Ergonomics for Beginners or the tried and tested Bodyspace.

6. Restrictions on vision will impact UI design and text display.

Whilst the 3D environment means we can look all around us, headsets actually restrict your peripheral vision somewhat, making the viewing cone – and hence the area for displaying information – smaller than one might expect. Crisp focus is maintained in the centre of view but can blur at the edges, meaning there are issues reading large bodies of text or signs. On UI panels, it can help to curve them a little – creating a concave surface that wraps around the user so content is all at a similar depth.
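
One way to achieve that wrap-around effect is to lay the panel’s elements out along an arc at a constant radius from the user’s head. The sketch below is a simplified Unity C# layout script – the radius and arc angle are illustrative, and it repositions children every frame purely to keep the example short:

```csharp
using UnityEngine;

// Sketch: arrange a panel's child elements along a concave arc around the user,
// so every item sits at the same viewing distance and stays in focus.
public class CurvedLayout : MonoBehaviour
{
    public Transform head;          // VR camera transform
    public float radius = 1.2f;     // constant viewing distance, in metres
    public float arcDegrees = 60f;  // total horizontal spread of the items

    void LateUpdate()
    {
        int count = transform.childCount;
        for (int i = 0; i < count; i++)
        {
            // Spread children evenly across the arc, centred on this panel's forward axis.
            float t = count > 1 ? (float)i / (count - 1) : 0.5f;
            float angle = Mathf.Lerp(-arcDegrees / 2f, arcDegrees / 2f, t);
            Vector3 direction = Quaternion.Euler(0f, angle, 0f) * transform.forward;

            Transform child = transform.GetChild(i);
            child.position = head.position + direction * radius;
            child.rotation = Quaternion.LookRotation(direction);  // face outward from the user
        }
    }
}
```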

Focus can also be lost when trying to look at objects very close to the face, and it’s unlikely text or icons will be legible very close up. We had to ensure controls anchored on our controllers – like our map – were scaled appropriately and their text legible when held out a little with the arms.

With VR today, you’re designing for a lower resolution display than most phones or monitors, compounded by the two lenses which effectively halve the display resolution – the Vive is 1080×1200 per eye. This will no doubt increase quickly in future hardware revisions, but today you need large font sizes for text to be legible, and may find decorative fonts and scripts difficult to read. We used good old Helvetica and similar sans serifs, of course selecting high-contrast colours for good visibility at distance.

Early controller-based main menu and object-based context menus.

Lighting is also a consideration. Reading documents in VR, for example, means they must be lit for good contrast, and like signage, this is linked to the design of the environment around you.

7. User interface needs to be big, bold and to the point

VR controllers at present don’t allow for the fine interactions we might get from a mouse cursor, and the experience is more akin to wielding handheld tools with which you push buttons and grab objects. Whilst there are systems in development, such as Leap Motion, which track individual fingers, using most systems today means designing for less precise interactions. Touchscreens need larger buttons and bounding areas than mouse- or trackpad-driven interfaces, and VR controls need to be larger again, with generous spacing between adjacent buttons. This of course means UI elements will take up more space and allow for less detail, meaning less information is generally in view at any given time.

Prototype UI panel designed for comfortable viewing and use.

The reduced display resolution also means fine lines and graphic elements can become pixellated so it’s better to keep them large and graphically simple. As with text, expect to display a limited amount of info on screen at once and for panels to be large.

8. Heads-up displays are harder than you think.

Whilst exploring options for navigation, we tried several HUDs to indicate users’ next waypoint for travel. 2D arrows overlaid on the view were difficult to focus on and relate to 3D space, and tended to irritate by obscuring vision. We felt that rendering a 2D HUD a short distance away would be better and easier to focus on, but this proved difficult to achieve in Unity and so we moved on to 3D options.

HUD experiment 1: 2D arrow overlays, indicating direction to turn to the next waypoint.
HUD experiment 2: A fixed 3D arrow in the centre of view, and animated arrows flying from the camera to the user’s next waypoint.

3D animated arrows flying from the user’s head to their next waypoint fitted into the environment, but created similar issues with obscuring the view. Most successful was a single 3D ‘compass’ indicator that remained fixed in space at forehead level; however, in testing it felt strange to have the compass attached to your head.

Inspired by the handheld menus in Google’s Tilt Brush drawing application, we took the compass and instead anchored it to one of our controllers, which proved to be a better solution for our application, where users are focused on the environment around them throughout. When people need the indicator, they simply raise it into view like any handheld tool – a principle we used again for a map feature.
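
The compass logic itself is simple. Here’s a rough Unity C# sketch, assuming the arrow model is parented above the controller and the next waypoint is a known transform – both names are illustrative:

```csharp
using UnityEngine;

// Sketch: a compass arrow anchored above the controller that always points
// towards the user's next waypoint, raised into view like any handheld tool.
public class ControllerCompass : MonoBehaviour
{
    public Transform arrow;      // arrow model parented above the controller
    public Transform waypoint;   // the next destination to point at

    void Update()
    {
        if (waypoint == null) return;

        // Point only in the horizontal plane so the arrow doesn't tilt oddly
        // when the controller is raised or lowered.
        Vector3 toTarget = waypoint.position - arrow.position;
        toTarget.y = 0f;

        if (toTarget.sqrMagnitude > 0.0001f)
        {
            arrow.rotation = Quaternion.LookRotation(toTarget);
        }
    }
}
```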

A more successful navigation arrow, this time anchored above the controller.

9. You’ll sketch, test and experiment more

During development, our team found the 3D interfaces and interactions we wanted to describe to one another weren’t easily mocked up in any of the popular UX prototyping tools like Balsamiq, Invision or Axure. As the designer on the team I found myself sketching a lot, both to storyboard moments and describe 3D components.

Once we had an environment in place, I was able to produce quick mockups in Photoshop using screengrabs, sketch overlays and photos of my hands holding controllers. These higher fidelity mockups were useful for discussing visuals and location-specific interactions. If you’re a designer or developer with CAD skills, you can put them to good use here by quickly prototyping 3D interface elements. We used Cinema 4D which imported easily into Unity.

Quick Photoshop sketch of a feature idea for team discussion.
The same feature, prototyped and in use in VR.

Throughout our sprints, we’ve gotten prototype elements into the VR environment as soon as possible and tested them with people around the office almost every single day, trying out a wide range of ideas and persisting with those that stood up best to real use. The lack of established standards for UX design in VR has meant we’ve needed to experiment and try multiple concepts to achieve a high-quality experience.

10. You’ll use video and live demos to explain to others

Describing our application to remote stakeholders has presented its own challenges, as VR and the interactions needed are unfamiliar to most. Whenever possible you should absolutely bring everyone around to try things on the Vive, but DNV GL is a large firm and many of our colleagues are in other countries.

We usually handle this by screen sharing or sending links to Axure wireframes, but as not everyone has access to VR hardware, we’ve needed to get creative. We’ve found a private YouTube channel showing video footage of new features in use, with added commentary, to be a fast and effective communication method. It’s also formed a useful record of development progress we can refer back to later.

Our user story map has helped the wider team to understand context and how features will be used to complete tasks. As the videos are recorded from a first-person perspective, we also found storyboard illustrations showing the user in context helped to communicate what’s in the space around them. Initially challenging, communication has become easier as we’ve moved from prototypes of single components to later builds, where combined features can be seen in use and the story better understood.

Sample illustration to help stakeholders understand the product story.

Summary

1. Much of the established wisdom goes out the window
The medium hasn’t yet established design standards for interactions or process. Now the screen is gone, be prepared to overturn some longstanding ideas about good UX design.

2. Motion sickness happens for some users, but you can design it out
Low frame rates have more serious consequences in VR; stay above 90fps.
Involuntary motion maintains orientation but can cause discomfort. ‘Blink’ transitions are comfortable for more users but there is some orientation trade-off. If in doubt, try both and test to decide what’s best for your application.

3. Intuitive VR controls are physical and mimic the real world
Create UIs from physical objects or panels with depth, making use of real-world controls and semantics wherever possible. Controls and metadata are attached to objects, and grouped objects show visible links.

4. Wayfinding works best as part of the environment
Use the environment itself to draw users’ gaze, signpost for orientation and help them navigate. Placed at a comfortable viewing height, signage is intuitive and maintains immersion.

5. Whole body ergonomics are now an important consideration
Consider reach envelope and neck movement when positioning controls comfortably. Avoid placing objects very high or far away, and implement simple gestures that won’t get tiring with repeated use. The Vive headset cable is an occasional nuisance.

6. Restrictions on vision will impact UI design and text display
Viewing cone and resolution are limited. Plan to present less information on screen and keep it in the centre of view. Fonts should be sans serif, large and high contrast. Using curved, concave UI panels will help keep all content in focus.

7. User interface needs to be big, bold and to the point
Cater for less precise controller interactions by keeping UI controls large, with good spacing between them. Icons – much like text – should be large, graphically simple and not use fine line widths.

8. Heads-up displays are harder than you think
HUDs can obscure the view and feel out of place without some depth. Our team decided to anchor UI elements on the controllers instead, but try and test to find out what’s right for your application.

9. You’ll sketch, test and experiment more
Avoid popular prototyping software (for now) and instead use sketches and Photoshop mockups to communicate early ideas. If you have CAD skills, use them to build elements quickly for prototyping. Plan to experiment, trying multiple concepts to achieve a good quality experience.

10. You’ll use video and live demos to explain to others
Use video capture and live demos whenever possible to communicate updates to remote stakeholders. A user story map and sketch storyboards will help to give context and describe the surrounding environment.