Publication detail

Distributed Visual Sensor Network Fusion

CHMELAŘ, P.; ZENDULKA, J. Distributed Visual Sensor Network Fusion. 4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms. Brno: 2007. 2 p.
Title in English
Distributed Visual Sensor Network Fusion
Type
abstract
Language
Czech
Authors
Chmelař Petr, Ing.
Zendulka Jaroslav, doc. Ing., CSc. (UIFS)
Keywords

Visual sensor, distributed network, metadata management, moving objects, spatio-temporal data, Kalman filter, sensor fusion, object tracking, large area surveillance system.

Abstract

The poster deals with a framework for a distributed visual sensor network metadata management system. It is assumed that data coming from many cameras are annotated by computer vision modules to produce metadata representing moving objects and their states. The data are supposed to be noisy and uncertain, and some states might be missing. Firstly, the poster describes spatio-temporal data cleaning using the Kalman filter. Secondly, it copes with the fusion of many visual sensors and persistent object tracking within a large area. Thirdly, it describes the data and architecture model.
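The data-cleaning step mentioned above can be illustrated with a minimal constant-velocity Kalman filter for a single tracked object in 2D. This is only a sketch of the general technique: the state model, noise covariances, and function names below are illustrative assumptions, not the values or interfaces used in the poster. Note that a missing observation (one of the "missing states" the abstract mentions) is handled by running the predict step alone.

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],   # state transition for state (x, y, vx, vy)
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],    # only the position is observed
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 0.01           # process noise covariance (assumed)
R = np.eye(2) * 0.5            # measurement noise covariance (assumed)

def kalman_step(x, P, z=None):
    """One predict/update cycle; z=None means the observation is missing."""
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    if z is not None:
        # update with the new measurement
        y = z - H @ x                     # innovation
        S = H @ P @ H.T + R               # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
    return x, P

# usage: clean a short noisy track with one missing observation
x = np.array([0.0, 0.0, 1.0, 1.0])        # initial state guess
P = np.eye(4)
track = [np.array([1.1, 0.9]), None, np.array([3.0, 3.1])]
for z in track:
    x, P = kalman_step(x, P, z)
```

After the loop, `x` holds the filtered position and velocity estimate; in a sensor-network setting, each camera's measurements would feed such a filter per tracked object before fusion.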

Year
2007
Pages
2
Book
4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms
Conference
Machine Learning and Multimodal Interaction, Brno, CZ
Place
Brno
BibTeX
@misc{BUT192634,
  author="Petr {Chmelař} and Jaroslav {Zendulka}",
  title="Distributed Visual Sensor Network Fusion",
  booktitle="4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms",
  year="2007",
  pages="2",
  address="Brno",
  note="abstract"
}