Interaction options

Interactions through a mouse or keyboard have given way to direct interaction with displays through touch and voice. Immersive technologies (AR, MR and VR) and wearable displays enable even more intuitive interfaces between people and digital content by supporting natural modes of interaction, such as gesture and gaze. As experts in interactive experiences that bridge the physical and digital worlds, we apply a range of interaction techniques in our solutions.

Touch and multitouch

Touch-sensitive displays allow interactive control through tapping, dragging, sliding, pinching, zooming, panning, rotating and other manipulations.
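To make one of these manipulations concrete, the sketch below shows one way to recognise a basic pinch (zoom) gesture with the standard Pointer Events API available in web browsers. The element id ‘stage’ and the scaling behaviour are illustrative assumptions, not part of any particular product.

```typescript
// Minimal pinch-to-zoom sketch using the standard Pointer Events API.
// Tracks up to two active pointers and derives a scale factor from the
// change in distance between them.
const stage = document.getElementById('stage')!; // hypothetical content element
const pointers = new Map<number, { x: number; y: number }>();
let startDistance = 0;

function currentDistance(): number {
  const [a, b] = [...pointers.values()];
  return Math.hypot(a.x - b.x, a.y - b.y);
}

stage.addEventListener('pointerdown', (e: PointerEvent) => {
  pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
  if (pointers.size === 2) startDistance = currentDistance(); // pinch begins
});

stage.addEventListener('pointermove', (e: PointerEvent) => {
  if (!pointers.has(e.pointerId)) return;
  pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
  if (pointers.size === 2 && startDistance > 0) {
    const scale = currentDistance() / startDistance; // >1 spread, <1 pinch
    stage.style.transform = `scale(${scale})`;       // placeholder zoom effect
  }
});

stage.addEventListener('pointerup', (e: PointerEvent) => {
  pointers.delete(e.pointerId);
  startDistance = 0; // pinch ends when a finger lifts
});
```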

Our solutions leverage both direct and indirect touch control of digital content.

Indirect control involves solutions where users interact with content on another screen – a holographic display, a large window or wall display, etc. – via their mobile devices. We can use QR codes, RFID tags, BLE beacons or web links to turn the user’s mobile device into a ‘remote control’ for the other screen.
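As a rough sketch of this ‘remote control’ pattern, the example below assumes a relay server that forwards WebSocket messages from the visitor’s phone to the large display; the URL, session id and command schema are invented for illustration.

```typescript
// Sketch of the 'remote control' pattern: the mobile page (opened from a
// QR code or web link) sends commands over a WebSocket, and the display
// page applies them. A server relaying messages between the two is assumed.
type Command =
  | { type: 'next' }
  | { type: 'previous' }
  | { type: 'select'; itemId: string };

// --- On the visitor's mobile device ---
const remote = new WebSocket('wss://example.com/session/abc123');
function send(command: Command): void {
  remote.send(JSON.stringify(command));
}
// e.g. wired to an on-screen button:
// send({ type: 'select', itemId: 'exhibit-42' });

// --- On the large display ---
const display = new WebSocket('wss://example.com/session/abc123');
display.onmessage = (event: MessageEvent) => {
  const command: Command = JSON.parse(event.data);
  switch (command.type) {
    case 'next':     showNextSlide(); break;
    case 'previous': showPreviousSlide(); break;
    case 'select':   highlightItem(command.itemId); break;
  }
};

// Hypothetical rendering hooks for the display content.
declare function showNextSlide(): void;
declare function showPreviousSlide(): void;
declare function highlightItem(itemId: string): void;
```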

Depending on the display type, direct touch control of digital content is possible either by a single user at a time or by multiple users simultaneously on multitouch displays. Direct touch interaction can also be combined with mobile control, letting users continue the interactive experience on the touchscreen of their personal device.
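One way such a handoff can work, sketched below, is to serialize the current state of the experience into a link that the display renders as a QR code; scanning it restores the state on the phone. The URL format and state shape here are assumptions for the example.

```typescript
// Sketch of handing an interactive session over to a visitor's phone:
// the display encodes the current state into a short-lived link, which
// is then shown as a QR code. URL and state shape are illustrative.
interface SessionState {
  sceneId: string;
  zoomLevel: number;
}

function buildHandoffUrl(state: SessionState): string {
  // Encode the state so the mobile page can restore exactly where
  // the visitor left off on the large display.
  const encoded = encodeURIComponent(JSON.stringify(state));
  const token = crypto.randomUUID(); // one-time session identifier
  return `https://example.com/continue?token=${token}&state=${encoded}`;
}

// The returned URL would be drawn as a QR code on the display with any
// QR library; scanning it resumes the experience on the phone.
const url = buildHandoffUrl({ sceneId: 'gallery-3', zoomLevel: 1.5 });
```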

Gestures

Hands are commonly used for gesture-based interactions – either alone or with a hand-held device. Grabbing, moving, resizing and rotating virtual objects, or selecting and activating virtual content through air tapping, are examples of intuitive gesture inputs in immersive AR/MR/VR environments. Gesture control requires high-precision position tracking of the user’s head and hands, either through external cameras and sensors or through tracking systems built into head-mounted displays.
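As a deliberately SDK-agnostic illustration, the sketch below detects a pinch (‘air tap’) from two tracked fingertip positions; in practice the joint data would come from the headset’s hand-tracking API, and the 2 cm threshold is an assumption.

```typescript
// Sketch of pinch ('air tap') detection from tracked hand joints.
// Joint positions are assumed to arrive every tracking frame.
interface Vec3 { x: number; y: number; z: number; }

function dist(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

const PINCH_THRESHOLD_M = 0.02; // fingertips closer than ~2 cm => pinch
let pinching = false;

// Called every frame with the latest thumb- and index-tip positions.
function updatePinch(thumbTip: Vec3, indexTip: Vec3): void {
  const closed = dist(thumbTip, indexTip) < PINCH_THRESHOLD_M;
  if (closed && !pinching) onPinchStart(); // grab the object under the hand
  if (!closed && pinching) onPinchEnd();   // release it
  pinching = closed;
}

// Hypothetical handlers that grab/release the targeted virtual object.
declare function onPinchStart(): void;
declare function onPinchEnd(): void;
```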

For less immersive applications, we can use Microsoft’s Kinect technology to enable gesture interaction; full-body gesture control of animated virtual characters in a gamified experience is one example. An even more straightforward alternative is motion detection: people passing by or standing within a specified range trigger digital content by their mere presence.
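The presence-trigger idea can be sketched with an abstracted sensor read-out, as below; the trigger range and cooldown values are assumptions chosen for the example.

```typescript
// Sketch of a presence trigger: digital content starts when a person is
// detected within a given range and stops when the zone is empty again.
const TRIGGER_RANGE_M = 2.5; // react to people closer than 2.5 m
const COOLDOWN_MS = 3000;    // avoid flicker when someone lingers at the edge

let active = false;
let lastSeen = 0;

// Called with the nearest detected person's distance (or null if nobody
// is in view), e.g. on every frame from a depth sensor or camera.
function onSensorReading(nearestDistanceM: number | null): void {
  const now = Date.now();
  if (nearestDistanceM !== null && nearestDistanceM < TRIGGER_RANGE_M) {
    lastSeen = now;
    if (!active) { active = true; startAttractContent(); }
  } else if (active && now - lastSeen > COOLDOWN_MS) {
    active = false;
    stopAttractContent();
  }
}

// Hypothetical hooks that start/stop the triggered content.
declare function startAttractContent(): void;
declare function stopAttractContent(): void;
```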

Gaze

Gaze control is typically faster and requires less effort than other types of interactive input. In its simplest form, you select and activate an interactive object by looking at it and holding your gaze on it for a moment – a so-called dwell. Gaze can also be combined with other, more explicit inputs, such as button presses or touch. It enables seamless interaction in VR experiences and with augmented or mixed reality content placed in the real world: selecting an item to trigger an action, expanding a text box to see more information, scrolling through a document and so on. Gaze interaction relies on eye or head tracking technology.
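A minimal sketch of the dwell mechanism is shown below; the target ids are assumed to come from an eye- or head-tracking raycast, and the 800 ms dwell time is an illustrative choice.

```typescript
// Sketch of dwell-based gaze selection: an object activates once the
// gaze has rested on it continuously for a set time.
const DWELL_TIME_MS = 800;

let gazedTarget: string | null = null;
let gazeStart = 0;
let activated = false;

// Called every frame with the id of the object the gaze ray currently
// hits, or null when the user is looking at empty space.
function updateGaze(targetId: string | null): void {
  if (targetId !== gazedTarget) {
    gazedTarget = targetId;        // gaze moved to a new target
    gazeStart = performance.now(); // restart the dwell timer
    activated = false;
  } else if (
    targetId !== null &&
    !activated &&
    performance.now() - gazeStart >= DWELL_TIME_MS
  ) {
    activated = true;              // fire once per dwell
    activate(targetId);
  }
}

// Hypothetical handler that performs the selected object's action.
declare function activate(targetId: string): void;
```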

Interactive buttons or hotspots

Interactive buttons or hotspots are typical interface elements embedded in digital content to enable interaction. When selected – by click, touch or gaze, for example – they trigger an action. Watch the following video to see how hotspots work in a simple interactive presentation.
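Alongside the video, the sketch below shows what a hotspot layer can look like under the hood: rectangular regions with a hit test that works for any pointing input. The hotspot data and its action are invented for illustration.

```typescript
// Minimal hotspot sketch: rectangular regions embedded in the content
// that trigger an action when selected. The hit test works for any
// input that yields a point: a click, a touch, or the spot where a
// gaze ray intersects the content plane.
interface Hotspot {
  id: string;
  x: number; y: number;           // top-left corner, in content coordinates
  width: number; height: number;
  action: () => void;             // what the hotspot triggers
}

const hotspots: Hotspot[] = [
  { id: 'info', x: 100, y: 50, width: 120, height: 40,
    action: () => console.log('Expand the info panel') }, // placeholder action
];

function hitTest(px: number, py: number): void {
  for (const h of hotspots) {
    if (px >= h.x && px <= h.x + h.width &&
        py >= h.y && py <= h.y + h.height) {
      h.action(); // first matching hotspot wins
      return;
    }
  }
}

// e.g. wired to clicks:
// element.addEventListener('click', e => hitTest(e.offsetX, e.offsetY));
```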