
Citation and metadata

Recommended citation

Manuel Huber, Michael Schlegel, and Gudrun Klinker, Application of Time-Delay Estimation to Mixed Reality Multisensor Tracking. Journal of Virtual Reality and Broadcasting, 11(2014), no. 3. (urn:nbn:de:0009-6-38778)

EndNote

%0 Journal Article
%T Application of Time-Delay Estimation to Mixed Reality Multisensor Tracking
%A Huber, Manuel
%A Schlegel, Michael
%A Klinker, Gudrun
%J Journal of Virtual Reality and Broadcasting
%D 2014
%V 11(2014)
%N 3
%@ 1860-2037
%F huber2014
%X Spatial tracking is one of the most challenging and important parts of Mixed Reality environments. Many applications, especially in the domain of Augmented Reality, rely on the fusion of several tracking systems in order to optimize the overall performance. While the topic of spatial tracking sensor fusion has already seen considerable interest, most results only deal with the integration of carefully arranged setups as opposed to dynamic sensor fusion setups. A crucial prerequisite for correct sensor fusion is the temporal alignment of the tracking data from several sensors. Tracking sensors, as typically encountered in Mixed Reality applications, are generally not synchronized. We present a general method to calibrate the temporal offset between different sensors by the Time Delay Estimation method, which can be used to perform on-line temporal calibration. By applying Time Delay Estimation to the tracking data, we show that the temporal offset between generic Mixed Reality spatial tracking sensors can be calibrated. To show the correctness and the feasibility of this approach, we have examined different variations of our method and evaluated various combinations of tracking sensors. We furthermore integrated this time synchronization method into our UBITRACK Mixed Reality tracking framework to provide facilities for calibration and real-time data alignment.
%L 004
%K calibration
%K sensor fusion
%K synchronization
%K tracking
%K ubiquitous tracking
%R 10.20385/1860-2037/11.2014.3
%U http://nbn-resolving.de/urn:nbn:de:0009-6-38778
%U http://dx.doi.org/10.20385/1860-2037/11.2014.3


BibTeX

@Article{huber2014,
  author = 	"Huber, Manuel
		and Schlegel, Michael
		and Klinker, Gudrun",
  title = 	"Application of Time-Delay Estimation to Mixed Reality Multisensor Tracking",
  journal = 	"Journal of Virtual Reality and Broadcasting",
  year = 	"2014",
  volume = 	"11(2014)",
  number = 	"3",
  keywords = 	"calibration; sensor fusion; synchronization; tracking; ubiquitous tracking",
  abstract = 	"Spatial tracking is one of the most challenging and important parts of Mixed Reality environments. Many applications, especially in the domain of Augmented Reality, rely on the fusion of several tracking systems in order to optimize the overall performance. While the topic of spatial tracking sensor fusion has already seen considerable interest, most results only deal with the integration of carefully arranged setups as opposed to dynamic sensor fusion setups. A crucial prerequisite for correct sensor fusion is the temporal alignment of the tracking data from several sensors. Tracking sensors, as typically encountered in Mixed Reality applications, are generally not synchronized. We present a general method to calibrate the temporal offset between different sensors by the Time Delay Estimation method, which can be used to perform on-line temporal calibration. By applying Time Delay Estimation to the tracking data, we show that the temporal offset between generic Mixed Reality spatial tracking sensors can be calibrated. To show the correctness and the feasibility of this approach, we have examined different variations of our method and evaluated various combinations of tracking sensors. We furthermore integrated this time synchronization method into our UBITRACK Mixed Reality tracking framework to provide facilities for calibration and real-time data alignment.",
  issn = 	"1860-2037",
  doi = 	"10.20385/1860-2037/11.2014.3",
  url = 	"http://nbn-resolving.de/urn:nbn:de:0009-6-38778"
}

RIS

TY  - JOUR
AU  - Huber, Manuel
AU  - Schlegel, Michael
AU  - Klinker, Gudrun
PY  - 2014
DA  - 2014//
TI  - Application of Time-Delay Estimation to Mixed Reality Multisensor Tracking
JO  - Journal of Virtual Reality and Broadcasting
VL  - 11(2014)
IS  - 3
KW  - calibration
KW  - sensor fusion
KW  - synchronization
KW  - tracking
KW  - ubiquitous tracking
AB  - Spatial tracking is one of the most challenging and important parts of Mixed Reality environments. Many applications, especially in the domain of Augmented Reality, rely on the fusion of several tracking systems in order to optimize the overall performance. While the topic of spatial tracking sensor fusion has already seen considerable interest, most results only deal with the integration of carefully arranged setups as opposed to dynamic sensor fusion setups. A crucial prerequisite for correct sensor fusion is the temporal alignment of the tracking data from several sensors. Tracking sensors, as typically encountered in Mixed Reality applications, are generally not synchronized. We present a general method to calibrate the temporal offset between different sensors by the Time Delay Estimation method, which can be used to perform on-line temporal calibration. By applying Time Delay Estimation to the tracking data, we show that the temporal offset between generic Mixed Reality spatial tracking sensors can be calibrated. To show the correctness and the feasibility of this approach, we have examined different variations of our method and evaluated various combinations of tracking sensors. We furthermore integrated this time synchronization method into our UBITRACK Mixed Reality tracking framework to provide facilities for calibration and real-time data alignment.
SN  - 1860-2037
UR  - http://nbn-resolving.de/urn:nbn:de:0009-6-38778
DO  - 10.20385/1860-2037/11.2014.3
ID  - huber2014
ER  - 

Wordbib

<?xml version="1.0" encoding="UTF-8"?>
<b:Sources SelectedStyle="" xmlns:b="http://schemas.openxmlformats.org/officeDocument/2006/bibliography"  xmlns="http://schemas.openxmlformats.org/officeDocument/2006/bibliography" >
<b:Source>
<b:Tag>huber2014</b:Tag>
<b:SourceType>ArticleInAPeriodical</b:SourceType>
<b:Year>2014</b:Year>
<b:PeriodicalTitle>Journal of Virtual Reality and Broadcasting</b:PeriodicalTitle>
<b:Volume>11(2014)</b:Volume>
<b:Issue>3</b:Issue>
<b:Url>http://nbn-resolving.de/urn:nbn:de:0009-6-38778</b:Url>
<b:Url>http://dx.doi.org/10.20385/1860-2037/11.2014.3</b:Url>
<b:Author>
<b:Author><b:NameList>
<b:Person><b:Last>Huber</b:Last><b:First>Manuel</b:First></b:Person>
<b:Person><b:Last>Schlegel</b:Last><b:First>Michael</b:First></b:Person>
<b:Person><b:Last>Klinker</b:Last><b:First>Gudrun</b:First></b:Person>
</b:NameList></b:Author>
</b:Author>
<b:Title>Application of Time-Delay Estimation to Mixed Reality Multisensor Tracking</b:Title>
<b:Comments>Spatial tracking is one of the most challenging and important parts of Mixed Reality environments. Many applications, especially in the domain of Augmented Reality, rely on the fusion of several tracking systems in order to optimize the overall performance. While the topic of spatial tracking sensor fusion has already seen considerable interest, most results only deal with the integration of carefully arranged setups as opposed to dynamic sensor fusion setups. A crucial prerequisite for correct sensor fusion is the temporal alignment of the tracking data from several sensors. Tracking sensors, as typically encountered in Mixed Reality applications, are generally not synchronized. We present a general method to calibrate the temporal offset between different sensors by the Time Delay Estimation method, which can be used to perform on-line temporal calibration. By applying Time Delay Estimation to the tracking data, we show that the temporal offset between generic Mixed Reality spatial tracking sensors can be calibrated. To show the correctness and the feasibility of this approach, we have examined different variations of our method and evaluated various combinations of tracking sensors. We furthermore integrated this time synchronization method into our UBITRACK Mixed Reality tracking framework to provide facilities for calibration and real-time data alignment.</b:Comments>
</b:Source>
</b:Sources>

ISI

PT Journal
AU Huber, M
   Schlegel, M
   Klinker, G
TI Application of Time-Delay Estimation to Mixed Reality Multisensor Tracking
SO Journal of Virtual Reality and Broadcasting
PY 2014
VL 11(2014)
IS 3
DI 10.20385/1860-2037/11.2014.3
DE calibration; sensor fusion; synchronization; tracking; ubiquitous tracking
AB Spatial tracking is one of the most challenging and important parts of Mixed Reality environments. Many applications, especially in the domain of Augmented Reality, rely on the fusion of several tracking systems in order to optimize the overall performance. While the topic of spatial tracking sensor fusion has already seen considerable interest, most results only deal with the integration of carefully arranged setups as opposed to dynamic sensor fusion setups. A crucial prerequisite for correct sensor fusion is the temporal alignment of the tracking data from several sensors. Tracking sensors, as typically encountered in Mixed Reality applications, are generally not synchronized. We present a general method to calibrate the temporal offset between different sensors by the Time Delay Estimation method, which can be used to perform on-line temporal calibration. By applying Time Delay Estimation to the tracking data, we show that the temporal offset between generic Mixed Reality spatial tracking sensors can be calibrated. To show the correctness and the feasibility of this approach, we have examined different variations of our method and evaluated various combinations of tracking sensors. We furthermore integrated this time synchronization method into our UBITRACK Mixed Reality tracking framework to provide facilities for calibration and real-time data alignment.
ER


MODS

<mods>
  <titleInfo>
    <title>Application of Time-Delay Estimation to Mixed Reality Multisensor Tracking</title>
  </titleInfo>
  <name type="personal">
    <namePart type="family">Huber</namePart>
    <namePart type="given">Manuel</namePart>
  </name>
  <name type="personal">
    <namePart type="family">Schlegel</namePart>
    <namePart type="given">Michael</namePart>
  </name>
  <name type="personal">
    <namePart type="family">Klinker</namePart>
    <namePart type="given">Gudrun</namePart>
  </name>
  <abstract>Spatial tracking is one of the most challenging and important parts of Mixed Reality environments. Many applications, especially in the domain of Augmented Reality, rely on the fusion of several tracking systems in order to optimize the overall performance. While the topic of spatial tracking sensor fusion has already seen considerable interest, most results only deal with the integration of carefully arranged setups as opposed to dynamic sensor fusion setups.

A crucial prerequisite for correct sensor fusion is the temporal alignment of the tracking data from several sensors. Tracking sensors, as typically encountered in Mixed Reality applications, are generally not synchronized. We present a general method to calibrate the temporal offset between different sensors by the Time Delay Estimation method, which can be used to perform on-line temporal calibration. By applying Time Delay Estimation to the tracking data, we show that the temporal offset between generic Mixed Reality spatial tracking sensors can be calibrated.

To show the correctness and the feasibility of this approach, we have examined different variations of our method and evaluated various combinations of tracking sensors. We furthermore integrated this time synchronization method into our UBITRACK Mixed Reality tracking framework to provide facilities for calibration and real-time data alignment.</abstract>
  <subject>
    <topic>calibration</topic>
    <topic>sensor fusion</topic>
    <topic>synchronization</topic>
    <topic>tracking</topic>
    <topic>ubiquitous tracking</topic>
  </subject>
  <classification authority="ddc">004</classification>
  <relatedItem type="host">
    <genre authority="marcgt">periodical</genre>
    <genre>academic journal</genre>
    <titleInfo>
      <title>Journal of Virtual Reality and Broadcasting</title>
    </titleInfo>
    <part>
      <detail type="volume">
        <number>11(2014)</number>
      </detail>
      <detail type="issue">
        <number>3</number>
      </detail>
      <date>2014</date>
    </part>
  </relatedItem>
  <identifier type="issn">1860-2037</identifier>
  <identifier type="urn">urn:nbn:de:0009-6-38778</identifier>
  <identifier type="doi">10.20385/1860-2037/11.2014.3</identifier>
  <identifier type="uri">http://nbn-resolving.de/urn:nbn:de:0009-6-38778</identifier>
  <identifier type="citekey">huber2014</identifier>
</mods>