Citation and metadata
Recommended citation
Christian Bailer, Alain Pagani, and Didier Stricker, A user supported object tracking framework for interactive video production. Journal of Virtual Reality and Broadcasting, 11(2014), no. 9. (urn:nbn:de:0009-6-40305)
Download Citation
Endnote
%0 Journal Article
%T A user supported object tracking framework for interactive video production
%A Bailer, Christian
%A Pagani, Alain
%A Stricker, Didier
%J Journal of Virtual Reality and Broadcasting
%D 2014
%V 11(2014)
%N 9
%@ 1860-2037
%F bailer2014
%X We present a user supported tracking framework that combines automatic tracking with extended user input to create error free tracking results that are suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms - existing ones and new ones that are described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results in an easy way, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos.
%L 004
%K data fusion
%K interactive tracking
%K interactive video
%K tracking methods
%K web applications
%R 10.20385/1860-2037/11.2014.9
%U http://nbn-resolving.de/urn:nbn:de:0009-6-40305
%U http://dx.doi.org/10.20385/1860-2037/11.2014.9
Bibtex
@Article{bailer2014,
  author   = "Bailer, Christian and Pagani, Alain and Stricker, Didier",
  title    = "A user supported object tracking framework for interactive video production",
  journal  = "Journal of Virtual Reality and Broadcasting",
  year     = "2014",
  volume   = "11(2014)",
  number   = "9",
  keywords = "data fusion; interactive tracking; interactive video; tracking methods; web applications",
  abstract = "We present a user supported tracking framework that combines automatic tracking with extended user input to create error free tracking results that are suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms - existing ones and new ones that are described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results in an easy way, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos.",
  issn     = "1860-2037",
  doi      = "10.20385/1860-2037/11.2014.9",
  url      = "http://nbn-resolving.de/urn:nbn:de:0009-6-40305"
}
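If the entry is to be consumed programmatically rather than imported into a reference manager, a minimal sketch along the following lines works. It assumes the third-party bibtexparser package (not part of this page's exports; any BibTeX parser would do), whose 1.x API parses a string into a list of plain dicts.

```python
# A minimal sketch, assuming the third-party bibtexparser package
# (pip install bibtexparser, 1.x API) is available.
import bibtexparser

bibtex_str = """
@Article{bailer2014,
  author  = "Bailer, Christian and Pagani, Alain and Stricker, Didier",
  title   = "A user supported object tracking framework for interactive video production",
  journal = "Journal of Virtual Reality and Broadcasting",
  year    = "2014"
}
"""

db = bibtexparser.loads(bibtex_str)  # parse the string into a BibDatabase
entry = db.entries[0]                # each entry is a dict keyed by field name
print(entry["title"])
print(entry["author"])
```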
RIS
TY  - JOUR
AU  - Bailer, Christian
AU  - Pagani, Alain
AU  - Stricker, Didier
PY  - 2014
DA  - 2014//
TI  - A user supported object tracking framework for interactive video production
JO  - Journal of Virtual Reality and Broadcasting
VL  - 11(2014)
IS  - 9
KW  - data fusion
KW  - interactive tracking
KW  - interactive video
KW  - tracking methods
KW  - web applications
AB  - We present a user supported tracking framework that combines automatic tracking with extended user input to create error free tracking results that are suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms - existing ones and new ones that are described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results in an easy way, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos.
SN  - 1860-2037
UR  - http://nbn-resolving.de/urn:nbn:de:0009-6-40305
DO  - 10.20385/1860-2037/11.2014.9
ID  - bailer2014
ER  -
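RIS is a simple line-oriented tag format (a two-letter tag, two spaces, a hyphen, then the value), so it can be read without any library. The sketch below uses a hypothetical parse_ris helper that accumulates repeated tags such as AU and KW into lists.

```python
# A minimal sketch of a standard-library-only RIS reader; parse_ris is a
# hypothetical helper, not part of this page's exports.
from collections import defaultdict

def parse_ris(text: str) -> dict:
    record = defaultdict(list)
    for line in text.strip().splitlines():
        if line.startswith("ER"):       # end-of-record marker
            break
        tag, _, value = line.partition("  - ")
        record[tag.strip()].append(value.strip())
    return dict(record)

ris = """TY  - JOUR
AU  - Bailer, Christian
AU  - Pagani, Alain
AU  - Stricker, Didier
TI  - A user supported object tracking framework for interactive video production
ER  -
"""

rec = parse_ris(ris)
print(rec["TI"][0])  # title
print(rec["AU"])     # all three authors
```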
Wordbib
<?xml version="1.0" encoding="UTF-8"?> <b:Sources SelectedStyle="" xmlns:b="http://schemas.openxmlformats.org/officeDocument/2006/bibliography" xmlns="http://schemas.openxmlformats.org/officeDocument/2006/bibliography" > <b:Source> <b:Tag>bailer2014</b:Tag> <b:SourceType>ArticleInAPeriodical</b:SourceType> <b:Year>2014</b:Year> <b:PeriodicalTitle>Journal of Virtual Reality and Broadcasting</b:PeriodicalTitle> <b:Volume>11(2014)</b:Volume> <b:Issue>9</b:Issue> <b:Url>http://nbn-resolving.de/urn:nbn:de:0009-6-40305</b:Url> <b:Url>http://dx.doi.org/10.20385/1860-2037/11.2014.9</b:Url> <b:Author> <b:Author><b:NameList> <b:Person><b:Last>Bailer</b:Last><b:First>Christian</b:First></b:Person> <b:Person><b:Last>Pagani</b:Last><b:First>Alain</b:First></b:Person> <b:Person><b:Last>Stricker</b:Last><b:First>Didier</b:First></b:Person> </b:NameList></b:Author> </b:Author> <b:Title>A user supported object tracking framework for interactive video production</b:Title> <b:Comments>We present a user supported tracking framework that combines automatic tracking with extended user input to create error free tracking results that are suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms - existing ones and new ones that are described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results in an easy way, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos.</b:Comments> </b:Source> </b:Sources>Download
ISI
PT Journal
AU Bailer, C
   Pagani, A
   Stricker, D
TI A user supported object tracking framework for interactive video production
SO Journal of Virtual Reality and Broadcasting
PY 2014
VL 11(2014)
IS 9
DI 10.20385/1860-2037/11.2014.9
DE data fusion; interactive tracking; interactive video; tracking methods; web applications
AB We present a user supported tracking framework that combines automatic tracking with extended user input to create error free tracking results that are suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms - existing ones and new ones that are described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results in an easy way, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos.
ER
Mods
<mods>
  <titleInfo>
    <title>A user supported object tracking framework for interactive video production</title>
  </titleInfo>
  <name type="personal">
    <namePart type="family">Bailer</namePart>
    <namePart type="given">Christian</namePart>
  </name>
  <name type="personal">
    <namePart type="family">Pagani</namePart>
    <namePart type="given">Alain</namePart>
  </name>
  <name type="personal">
    <namePart type="family">Stricker</namePart>
    <namePart type="given">Didier</namePart>
  </name>
  <abstract>We present a user supported tracking framework that combines automatic tracking with extended user input to create error free tracking results that are suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms - existing ones and new ones that are described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results in an easy way, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos.</abstract>
  <subject>
    <topic>data fusion</topic>
    <topic>interactive tracking</topic>
    <topic>interactive video</topic>
    <topic>tracking methods</topic>
    <topic>web applications</topic>
  </subject>
  <classification authority="ddc">004</classification>
  <relatedItem type="host">
    <genre authority="marcgt">periodical</genre>
    <genre>academic journal</genre>
    <titleInfo>
      <title>Journal of Virtual Reality and Broadcasting</title>
    </titleInfo>
    <part>
      <detail type="volume">
        <number>11(2014)</number>
      </detail>
      <detail type="issue">
        <number>9</number>
      </detail>
      <date>2014</date>
    </part>
  </relatedItem>
  <identifier type="issn">1860-2037</identifier>
  <identifier type="urn">urn:nbn:de:0009-6-40305</identifier>
  <identifier type="doi">10.20385/1860-2037/11.2014.9</identifier>
  <identifier type="uri">http://nbn-resolving.de/urn:nbn:de:0009-6-40305</identifier>
  <identifier type="citekey">bailer2014</identifier>
</mods>
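Because the MODS record above declares no XML namespace, the standard-library ElementTree parser can address its elements with plain tag paths. A minimal sketch for extracting the title and author names:

```python
# A minimal sketch using only the standard library; it assumes the
# un-namespaced MODS snippet shown above.
import xml.etree.ElementTree as ET

mods = """<mods>
  <titleInfo><title>A user supported object tracking framework for interactive video production</title></titleInfo>
  <name type="personal">
    <namePart type="family">Bailer</namePart>
    <namePart type="given">Christian</namePart>
  </name>
</mods>"""

root = ET.fromstring(mods)
print(root.findtext("titleInfo/title"))          # article title
for name in root.findall("name"):                # one <name> per author
    family = name.findtext("namePart[@type='family']")
    given = name.findtext("namePart[@type='given']")
    print(f"{given} {family}")
```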
Full Metadata
| Field | Value |
|---|---|
| Bibliographic Citation | JVRB, 11(2014), no. 9. |
| Title | A user supported object tracking framework for interactive video production (eng) |
| Author | Christian Bailer, Alain Pagani, Didier Stricker |
| Language | eng |
| Abstract | We present a user supported tracking framework that combines automatic tracking with extended user input to create error free tracking results that are suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms - existing ones and new ones that are described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results in an easy way, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos. |
| Subject | data fusion, interactive tracking, interactive video, tracking methods, web applications |
| DDC | 004 |
| Rights | DPPL |
| URN | urn:nbn:de:0009-6-40305 |
| DOI | https://doi.org/10.20385/1860-2037/11.2014.9 |
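Both identifiers in the last two rows are persistent: the URN resolves via nbn-resolving.de and the DOI via doi.org. A minimal standard-library sketch that follows the doi.org redirect to the article's landing page:

```python
# A minimal sketch, standard library only; urlopen follows the redirect
# issued by the doi.org resolver (requires network access).
import urllib.request

url = "https://doi.org/10.20385/1860-2037/11.2014.9"
with urllib.request.urlopen(url) as resp:
    print(resp.url)  # final URL after redirects, i.e. the landing page
```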