Learning to Detect Event Sequences in Surveillance Streams at Very Low Frame Rate
Some camera surveillance systems are designed to be autonomous, both from the energy and the memory point of view. Autonomy allows operation in environments where wiring cameras for power and data transmission is neither feasible nor desirable. In these contexts, unattended operation over long periods requires choosing a frame rate low enough to match the speed of the process under supervision while minimizing energy and memory consumption. The result of surveillance is then a large stream of images acquired sparsely over time, with limited visual continuity from one frame to the next. Reviewing these images to detect events of interest requires techniques that do not assume objects can be traced by visual similarity. When the process under surveillance shows recurrent patterns of events over time, as is often the case in industrial settings, other possibilities open up. Since images are time-stamped, this suggests techniques that use temporal data to help detect relevant events. This contribution presents an image review tool that combines in cascade a scene change detector (SCD) with a temporal filter. The temporal filter learns to recognize relevant SCD events by their time distribution on the image stream. The learning phase is supported by image annotations provided by end-users during past reviews. The concept is tested on a benchmark of real surveillance images stemming from a nuclear safeguards context. Experimental results show that the combined SCD-temporal filter significantly reduces the workload necessary to detect safeguards-relevant events in large image streams.
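The abstract describes the cascade only at a conceptual level. The sketch below is a hypothetical minimal rendering of the idea, assuming a toy intensity-difference scene change detector, a single gap-to-previous-event temporal feature, and a decision threshold learned from annotated past reviews; none of these specific choices come from the paper.

```python
# A minimal, illustrative sketch of an SCD + temporal-filter cascade.
# The frame representation, the intensity-difference detector, the
# gap-to-previous-event feature and the learned threshold rule are all
# assumptions made for this example, not the method used in the paper.
from dataclasses import dataclass
from statistics import mean
from typing import List, Tuple


@dataclass
class Frame:
    timestamp: float       # acquisition time in seconds (frames are sparse)
    mean_intensity: float  # toy stand-in for the actual image content


def detect_scd_events(frames: List[Frame], diff_threshold: float = 10.0) -> List[float]:
    """Toy scene change detector: flag frames whose content differs strongly
    from the previous frame and return their timestamps."""
    return [curr.timestamp
            for prev, curr in zip(frames, frames[1:])
            if abs(curr.mean_intensity - prev.mean_intensity) > diff_threshold]


def gaps_to_previous(event_times: List[float]) -> List[float]:
    """Temporal feature: time elapsed since the previous SCD event
    (0 for the first event)."""
    if not event_times:
        return []
    return [t - p for p, t in zip([event_times[0]] + event_times[:-1], event_times)]


def learn_gap_threshold(annotations: List[Tuple[float, bool]]) -> float:
    """Learn a decision threshold from past reviews, given pairs of
    (gap to previous event, annotated as relevant). Assumes relevant
    events cluster in time, i.e. tend to have shorter gaps."""
    relevant = [g for g, is_relevant in annotations if is_relevant]
    irrelevant = [g for g, is_relevant in annotations if not is_relevant]
    return (mean(relevant) + mean(irrelevant)) / 2.0


def review(frames: List[Frame], gap_threshold: float) -> List[float]:
    """Cascade: run the SCD, then keep only events whose gap to the
    previous event is short enough to match relevant behaviour."""
    events = detect_scd_events(frames)
    return [t for t, gap in zip(events, gaps_to_previous(events))
            if gap <= gap_threshold]


if __name__ == "__main__":
    # Synthetic stream: intensity jumps around t = 60-70 and t = 300-305.
    frames = [Frame(float(t), 100.0 if t in (60, 65, 300) else 50.0)
              for t in range(0, 400, 5)]
    threshold = learn_gap_threshold([(5.0, True), (10.0, True), (120.0, False)])
    print(review(frames, threshold))  # SCD events kept by the temporal filter
```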
LOMBARDI Paolo; VERSINO Cristina
2010-12-02
Springer-Verlag
JRC55098
ISBN: 978-0-85729-056-4
ISSN: 1617-7916
URL: https://publications.jrc.ec.europa.eu/repository/handle/JRC55098
DOI: 10.1007/978-0-85729-057-1_5