Publication Details
Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View
Potúček Igor (FIT BUT)
Rigoll Gerhard, Prof. Dr.-Ing. (TUM)
Wallhoff Frank, Dipl.-Ing. (TUM)
Zobl Martin, Dipl.-Ing. (TUM)
computer vision, skin detection, omnidirectional image, geometrical corrections, face detection, recognition.
The robust localization and tracking of faces in video streams is a fundamental prerequisite for many subsequent multi-modal recognition approaches. Especially in meeting scenarios, several independent processing queues often exist that use the position and gaze of faces, such as group-action and face recognizers. The cost of recording meeting scenarios with multiple cameras is obviously higher than that of a single omnidirectional camera setup, so it is desirable to use these more easily acquired omnidirectional recordings. This work presents an implementation of a robust particle-filter-based face tracker operating on omnidirectional views. It is shown how omnidirectional images must be unwarped before they can be processed by localization and tracking systems designed for undistorted material. The performance of the system is evaluated on part of the PETS-ICVS 2003 Smart Meeting Room dataset.
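The unwarping step mentioned in the abstract maps the circular (donut-shaped) omnidirectional image to a rectangular panorama via a polar-to-cartesian transform. The paper's exact procedure is not given here, so the following is only a minimal nearest-neighbour sketch; the function name, the `r_min`/`r_max` mirror radii, and the output resolution are illustrative assumptions.

```python
import numpy as np

def unwarp_omni(img, center, r_min, r_max, out_w=360, out_h=None):
    """Unwarp a circular omnidirectional image into a panoramic strip.

    Each output column corresponds to an angle around the mirror center;
    each output row corresponds to a radius between r_min and r_max.
    Nearest-neighbour sampling (illustrative sketch, not the paper's method).
    """
    if out_h is None:
        out_h = r_max - r_min
    cx, cy = center
    # Angles for each output column, radii for each output row.
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radius = np.linspace(r_min, r_max - 1, out_h)
    # Polar -> cartesian source coordinates, broadcast to (out_h, out_w).
    xs = (cx + radius[:, None] * np.cos(theta[None, :])).round().astype(int)
    ys = (cy + radius[:, None] * np.sin(theta[None, :])).round().astype(int)
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]
```

The resulting rectangular panorama can then be fed to face localization and tracking components that expect conventional (undistorted) imagery; in practice bilinear interpolation would replace the nearest-neighbour lookup for better image quality.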
@INPROCEEDINGS{FITPUB7438,
  author    = "Igor Pot\'{u}\v{c}ek and Gerhard Rigoll and Frank Wallhoff and Martin Zobl",
  title     = "Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View",
  pages     = "933--936",
  booktitle = "17th International Conference on Pattern Recognition (ICPR 2004)",
  year      = 2004,
  location  = "Cambridge, GB",
  publisher = "IEEE Computer Society",
  ISBN      = "0-7695-2128-2",
  language  = "english",
  url       = "https://www.fit.vut.cz/research/publication/7438"
}