Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions

Citation and metadata

Recommended citation

Michela Ferron, Nadia Mana, Ornella Mich, Christopher Reeves, and Gianluca Schiavo, Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions. Journal of Virtual Reality and Broadcasting, 16(2019), no. 2. (urn:nbn:de:0009-6-49583)

Download Citation

EndNote

%0 Journal Article
%T Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions
%A Ferron, Michela
%A Mana, Nadia
%A Mich, Ornella
%A Reeves, Christopher
%A Schiavo, Gianluca
%J Journal of Virtual Reality and Broadcasting
%D 2020
%V 16(2019)
%N 2
%@ 1860-2037
%F ferron2020
%X This paper presents two early studies aimed at investigating issues concerning the design of multimodal interaction, based on voice commands and one-hand mid-air gestures, with mobile technology specifically designed for visually impaired and elderly users. These studies were carried out on a new device allowing enhanced speech recognition (including lip movement analysis) and mid-air gesture interaction on the Android operating system (smartphone and tablet PC). We discuss the initial findings and challenges raised by these novel interaction modalities, in particular the issues regarding the design of feedback and feedforward, the problem of false positives, and the correct orientation and distance of the hand and the device during the interaction. Finally, we present a set of feedback and feedforward solutions designed to overcome the main issues highlighted.
%L 004
%K Augmented Reality (AR)
%K Human-Computer Interaction
%K Multimodal Interaction
%K Visually Impaired
%R 10.20385/1860-2037/16.2019.2
%U http://nbn-resolving.de/urn:nbn:de:0009-6-49583
%U http://dx.doi.org/10.20385/1860-2037/16.2019.2

BibTeX

@Article{ferron2020,
  author = 	"Ferron, Michela
		and Mana, Nadia
		and Mich, Ornella
		and Reeves, Christopher
		and Schiavo, Gianluca",
  title = 	"Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions",
  journal = 	"Journal of Virtual Reality and Broadcasting",
  year = 	"2020",
  volume = 	"16(2019)",
  number = 	"2",
  keywords = 	"Augmented Reality (AR); Human-Computer Interaction; Multimodal Interaction; Visually Impaired",
  abstract = 	"This paper presents two early studies aimed at investigating issues concerning the design of multimodal interaction, based on voice commands and one-hand mid-air gestures, with mobile technology specifically designed for visually impaired and elderly users. These studies were carried out on a new device allowing enhanced speech recognition (including lip movement analysis) and mid-air gesture interaction on the Android operating system (smartphone and tablet PC). We discuss the initial findings and challenges raised by these novel interaction modalities, in particular the issues regarding the design of feedback and feedforward, the problem of false positives, and the correct orientation and distance of the hand and the device during the interaction. Finally, we present a set of feedback and feedforward solutions designed to overcome the main issues highlighted.",
  issn = 	"1860-2037",
  doi = 	"10.20385/1860-2037/16.2019.2",
  url = 	"http://nbn-resolving.de/urn:nbn:de:0009-6-49583"
}

RIS

TY  - JOUR
AU  - Ferron, Michela
AU  - Mana, Nadia
AU  - Mich, Ornella
AU  - Reeves, Christopher
AU  - Schiavo, Gianluca
PY  - 2020
DA  - 2020//
TI  - Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions
JO  - Journal of Virtual Reality and Broadcasting
VL  - 16(2019)
IS  - 2
KW  - Augmented Reality (AR)
KW  - Human-Computer Interaction
KW  - Multimodal Interaction
KW  - Visually Impaired
AB  - This paper presents two early studies aimed at investigating issues concerning the design of multimodal interaction, based on voice commands and one-hand mid-air gestures, with mobile technology specifically designed for visually impaired and elderly users. These studies were carried out on a new device allowing enhanced speech recognition (including lip movement analysis) and mid-air gesture interaction on the Android operating system (smartphone and tablet PC). We discuss the initial findings and challenges raised by these novel interaction modalities, in particular the issues regarding the design of feedback and feedforward, the problem of false positives, and the correct orientation and distance of the hand and the device during the interaction. Finally, we present a set of feedback and feedforward solutions designed to overcome the main issues highlighted.
SN  - 1860-2037
UR  - http://nbn-resolving.de/urn:nbn:de:0009-6-49583
DO  - 10.20385/1860-2037/16.2019.2
ID  - ferron2020
ER  - 

Wordbib

<?xml version="1.0" encoding="UTF-8"?>
<b:Sources SelectedStyle="" xmlns:b="http://schemas.openxmlformats.org/officeDocument/2006/bibliography"  xmlns="http://schemas.openxmlformats.org/officeDocument/2006/bibliography" >
<b:Source>
<b:Tag>ferron2020</b:Tag>
<b:SourceType>ArticleInAPeriodical</b:SourceType>
<b:Year>2020</b:Year>
<b:PeriodicalTitle>Journal of Virtual Reality and Broadcasting</b:PeriodicalTitle>
<b:Volume>16(2019)</b:Volume>
<b:Issue>2</b:Issue>
<b:Url>http://nbn-resolving.de/urn:nbn:de:0009-6-49583</b:Url>
<b:Url>http://dx.doi.org/10.20385/1860-2037/16.2019.2</b:Url>
<b:Author>
<b:Author><b:NameList>
<b:Person><b:Last>Ferron</b:Last><b:First>Michela</b:First></b:Person>
<b:Person><b:Last>Mana</b:Last><b:First>Nadia</b:First></b:Person>
<b:Person><b:Last>Mich</b:Last><b:First>Ornella</b:First></b:Person>
<b:Person><b:Last>Reeves</b:Last><b:First>Christopher</b:First></b:Person>
<b:Person><b:Last>Schiavo</b:Last><b:First>Gianluca</b:First></b:Person>
</b:NameList></b:Author>
</b:Author>
<b:Title>Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions</b:Title>
<b:Comments>This paper presents two early studies aimed at investigating issues concerning the design of multimodal interaction, based on voice commands and one-hand mid-air gestures, with mobile technology specifically designed for visually impaired and elderly users. These studies were carried out on a new device allowing enhanced speech recognition (including lip movement analysis) and mid-air gesture interaction on the Android operating system (smartphone and tablet PC). We discuss the initial findings and challenges raised by these novel interaction modalities, in particular the issues regarding the design of feedback and feedforward, the problem of false positives, and the correct orientation and distance of the hand and the device during the interaction. Finally, we present a set of feedback and feedforward solutions designed to overcome the main issues highlighted.</b:Comments>
</b:Source>
</b:Sources>

ISI

PT Journal
AU Ferron, M
   Mana, N
   Mich, O
   Reeves, C
   Schiavo, G
TI Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions
SO Journal of Virtual Reality and Broadcasting
PY 2020
VL 16(2019)
IS 2
DI 10.20385/1860-2037/16.2019.2
DE Augmented Reality (AR); Human-Computer Interaction; Multimodal Interaction; Visually Impaired
AB This paper presents two early studies aimed at investigating issues concerning the design of multimodal interaction, based on voice commands and one-hand mid-air gestures, with mobile technology specifically designed for visually impaired and elderly users. These studies were carried out on a new device allowing enhanced speech recognition (including lip movement analysis) and mid-air gesture interaction on the Android operating system (smartphone and tablet PC). We discuss the initial findings and challenges raised by these novel interaction modalities, in particular the issues regarding the design of feedback and feedforward, the problem of false positives, and the correct orientation and distance of the hand and the device during the interaction. Finally, we present a set of feedback and feedforward solutions designed to overcome the main issues highlighted.
ER

MODS

<mods>
  <titleInfo>
    <title>Designing Mobile Multimodal Interaction for Visually Impaired and Older Adults: Challenges and Possible Solutions</title>
  </titleInfo>
  <name type="personal">
    <namePart type="family">Ferron</namePart>
    <namePart type="given">Michela</namePart>
  </name>
  <name type="personal">
    <namePart type="family">Mana</namePart>
    <namePart type="given">Nadia</namePart>
  </name>
  <name type="personal">
    <namePart type="family">Mich</namePart>
    <namePart type="given">Ornella</namePart>
  </name>
  <name type="personal">
    <namePart type="family">Reeves</namePart>
    <namePart type="given">Christopher</namePart>
  </name>
  <name type="personal">
    <namePart type="family">Schiavo</namePart>
    <namePart type="given">Gianluca</namePart>
  </name>
  <abstract>This paper presents two early studies aimed at investigating issues concerning the design of multimodal interaction, based on voice commands and one-hand mid-air gestures, with mobile technology specifically designed for visually impaired and elderly users. These studies were carried out on a new device allowing enhanced speech recognition (including lip movement analysis) and mid-air gesture interaction on the Android operating system (smartphone and tablet PC). We discuss the initial findings and challenges raised by these novel interaction modalities, in particular the issues regarding the design of feedback and feedforward, the problem of false positives, and the correct orientation and distance of the hand and the device during the interaction. Finally, we present a set of feedback and feedforward solutions designed to overcome the main issues highlighted.</abstract>
  <subject>
    <topic>Augmented Reality (AR)</topic>
    <topic>Human-Computer Interaction</topic>
    <topic>Multimodal Interaction</topic>
    <topic>Visually Impaired</topic>
  </subject>
  <classification authority="ddc">004</classification>
  <relatedItem type="host">
    <genre authority="marcgt">periodical</genre>
    <genre>academic journal</genre>
    <titleInfo>
      <title>Journal of Virtual Reality and Broadcasting</title>
    </titleInfo>
    <part>
      <detail type="volume">
        <number>16(2019)</number>
      </detail>
      <detail type="issue">
        <number>2</number>
      </detail>
      <date>2020</date>
    </part>
  </relatedItem>
  <identifier type="issn">1860-2037</identifier>
  <identifier type="urn">urn:nbn:de:0009-6-49583</identifier>
  <identifier type="doi">10.20385/1860-2037/16.2019.2</identifier>
  <identifier type="uri">http://nbn-resolving.de/urn:nbn:de:0009-6-49583</identifier>
  <identifier type="citekey">ferron2020</identifier>
</mods>