
16.2019

  1. 2020-10-06

    Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions

    This paper presents two early studies investigating the design of multimodal interaction, based on voice commands and one-hand mid-air gestures, with mobile technology specifically designed for visually impaired and elderly users. These studies were carried out on a new device that enables enhanced speech recognition (including lip-movement analysis) and mid-air gesture interaction on the Android operating system (smartphone and tablet PC). We discuss the initial findings and challenges raised by these novel interaction modalities, in particular the design of feedback and feedforward, the problem of false positives, and the correct orientation and distance of the hand and the device during interaction. Finally, we present a set of feedback and feedforward solutions designed to overcome the main issues highlighted.

    JVRB, 16(2019), no. 2.

EuroVR 2016
  1. 2021-05-04

    Evaluating the Effects of Visual Fidelity and Magnified View on User Experience in Virtual Reality Games

    Virtual reality has become more affordable in recent years, which has led to more content specifically developed for this medium. Training is one of the most promising application areas for virtual reality in terms of its benefits, and virtual reality properties may affect user performance. This study explores the effects of visual fidelity (high and low) and view zoom (normal and magnified) on task performance in virtual reality. The effects of visual fidelity have been explored before, with differing results depending on the task design, whereas the effects of view zoom on task performance have not yet been explored. An inspection task in virtual reality was developed and a user study was performed with 15 participants. Results indicated that low visual fidelity led to better task performance, whereas view zoom had no effect on performance.

    JVRB, 16(2019), no. 1.

VISIGRAPP 2019
  1. 2021-08-10

    A Sketch-based Interface for Real-time Control of Crowd Simulations that incorporate Dynamic Knowledge

    Controlling crowd simulations typically involves tweaking complex parameter sets in an attempt to reach a desired outcome, which can be unintuitive for non-technical users. This paper presents an approach to controlling pedestrian simulations in real time via sketching. Users can create entrances/exits, barriers to block paths, flow lines to guide pedestrians, waypoint areas, and storyboards to specify the journeys of crowd subgroups. Additionally, a timeline interface can be used to control when simulation events occur. The sketching approach is supported by a tiled navigation mesh (navmesh), based on the open source tool Recast, to support pedestrian navigation. The navmesh is updated in real time based on the user's sketches, and the simulation updates accordingly. A comparison between our navmesh approach and the more commonly used grid-based navigation approach shows that the navmesh approach scales better for large environments. The paper also presents possible solutions to the question of when pedestrians should react to real-time changes to the environment, whether or not these changes are in their field of vision. The effectiveness of the system is demonstrated with a set of scenarios and a practical application that makes use of a 3D model of an area of a UK city centre created using data from OpenStreetMap.

    JVRB, 16(2019), no. 3.