Constructing task visibility intervals for a surveillance system
2005, Proceedings of the Third ACM International Workshop on Video Surveillance & Sensor Networks (VSSN '05)
https://doi.org/10.1145/1099396.1099421

Abstract
Vision systems are increasingly being deployed to perform complex surveillance tasks. While improved algorithms are being developed to perform these tasks, it is also important that data suitable for these algorithms be acquired, a non-trivial problem in a dynamic and crowded scene viewed by multiple PTZ cameras. In this paper, we describe a real-time multi-camera system that collects images and videos of moving objects in such scenes, subject to task constraints. The system constructs "task visibility intervals" that contain information about what can be sensed in future time intervals. Constructing these intervals requires predicting future object motion and considering several factors such as object occlusion and camera control parameters. Such intervals can also be combined to form multi-task intervals, during which a single camera can collect video suitable for multiple tasks simultaneously. Experimental results illustrate the system's ability to construct such task visibility intervals and then schedule them using a greedy algorithm.
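To make the notions of a task visibility interval and its greedy scheduling concrete, the following minimal Python sketch models an interval as a (camera, task set, start, end) record and assigns intervals to cameras greedily by earliest end time. All names, fields, and the ordering heuristic are illustrative assumptions; the paper's actual intervals also encode feasible camera settings (pan, tilt, zoom) over time, which are omitted here.

```python
from dataclasses import dataclass
from typing import Dict, FrozenSet, List


@dataclass
class TVI:
    """Hypothetical, simplified task visibility interval (TVI)."""
    camera: str               # camera that can perform the capture
    tasks: FrozenSet[str]     # task(s) satisfied; more than one => multi-task interval
    start: float              # earliest usable time (seconds)
    end: float                # latest usable time (seconds)


def greedy_schedule(tvis: List[TVI]) -> Dict[str, List[TVI]]:
    """Greedily pick non-overlapping intervals per camera, ordered by
    earliest end time (classic interval scheduling); the paper's own
    greedy ordering may differ."""
    schedule: Dict[str, List[TVI]] = {}
    for tvi in sorted(tvis, key=lambda t: t.end):
        chosen = schedule.setdefault(tvi.camera, [])
        # Accept the interval only if it does not overlap anything already
        # scheduled on the same camera.
        if all(tvi.start >= other.end or tvi.end <= other.start for other in chosen):
            chosen.append(tvi)
    return schedule


if __name__ == "__main__":
    candidates = [
        TVI("cam1", frozenset({"track_person_3"}), 2.0, 6.0),
        TVI("cam1", frozenset({"face_capture_1", "track_person_3"}), 4.0, 7.0),
        TVI("cam2", frozenset({"face_capture_1"}), 1.0, 3.0),
    ]
    for cam, chosen in greedy_schedule(candidates).items():
        print(cam, [sorted(t.tasks) for t in chosen])
```

In this toy example, cam1 keeps only one of its two overlapping candidates, illustrating why combining compatible tasks into multi-task intervals before scheduling lets a single camera serve more tasks.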