Result Details

Distributed Visual Sensor Network Fusion

CHMELAŘ, P.; ZENDULKA, J. Distributed Visual Sensor Network Fusion. 4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms. Brno: 2007. 2 p.
Type
abstract
Language
Czech
Authors
Chmelař Petr, Ing., DIFS (FIT)
Zendulka Jaroslav, doc. Ing., CSc., DIFS (FIT)
Abstract

The poster deals with a framework for a distributed visual sensor network metadata management system. It is assumed that data coming from many cameras is annotated by computer vision modules to produce metadata representing moving objects and their states. The data is expected to be noisy and uncertain, and some states may be missing. First, the spatio-temporal data cleaning using a Kalman filter is described. Second, the fusion of many visual sensors and persistent object tracking within a large area are addressed. Third, the data and architecture model is described.
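As a rough illustration of the spatio-temporal cleaning step mentioned in the abstract, the following minimal Python sketch applies a constant-velocity Kalman filter to noisy 2-D position measurements, skipping the update when an observation is missing. All matrices, noise values, and variable names are illustrative assumptions, not parameters taken from the poster.

import numpy as np

# Minimal constant-velocity Kalman filter for 2-D object positions.
# State x = [px, py, vx, vy]; all values below are illustrative assumptions.

dt = 1.0  # time step between camera observations (assumed)
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)      # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)       # only position is measured
Q = np.eye(4) * 0.01                            # process noise covariance
R = np.eye(2) * 1.0                             # measurement noise covariance

x = np.zeros(4)          # initial state estimate
P = np.eye(4) * 10.0     # initial state covariance

def step(x, P, z):
    """One predict/update cycle; z is a noisy (px, py) measurement or None if missing."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    if z is not None:                       # missing state: keep the prediction only
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
    return x, P

# Example: clean a short, noisy track with one missing observation.
track = [(0.0, 0.0), (1.1, 0.9), None, (3.0, 3.2)]
for z in track:
    x, P = step(x, P, np.array(z) if z is not None else None)
    print(np.round(x[:2], 2))

In a multi-camera setting of the kind the poster describes, such a filter would be run per tracked object, with measurements from different sensors mapped into a common coordinate frame before the update step.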

Keywords

Visual sensor, distributed network, metadata management, moving objects, spatio-temporal data, Kalman filter, sensor fusion, object tracking, large area surveillance system.

English keywords

Visual sensor, distributed network, metadata management, moving objects, spatio-temporal data, Kalman filter, sensor fusion, object tracking, large area surveillance system.

Published
2007
Pages
2
Book
4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms
Conference
Machine Learning and Multimodal Interaction
Place
Brno
BibTeX
@misc{BUT192634,
  author="Petr {Chmelař} and Jaroslav {Zendulka}",
  title="Distributed Visual Sensor Network Fusion",
  booktitle="4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms",
  year="2007",
  pages="2",
  address="Brno",
  note="Abstract"
}
Projects
Security-Oriented Research in Information Technology, MŠMT, institutional resources of the state budget of the Czech Republic (e.g. research plans, research centres), MSM0021630528, start: 2007-01-01, end: 2013-12-31, running