A computationally efficient tracker with direct appearance-kinematic measure and adaptive Kalman filter

Rami Ben-Ari, Ohad Ben-Shahar

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Visual tracking is a common procedure in many real-time applications. Such systems are required to track objects under changes in illumination, dynamic viewing angle, image noise and occlusions (to name a few). To maintain real-time performance despite these challenging conditions, tracking methods must require extremely low computational resources, and therefore face a trade-off between robustness and speed. The emergence of consumer-level cameras capable of capturing video at 60 fps strains this trade-off even further. Unfortunately, state-of-the-art tracking techniques struggle to exceed 30 fps at VGA resolution on standard desktop hardware, let alone on typically weaker mobile devices. In this paper we suggest a significantly cheaper computational method for tracking in colour video clips that greatly improves tracking performance in terms of the robustness/speed trade-off. The suggested approach employs a novel similarity measure that explicitly combines appearance with object kinematics, and a new adaptive Kalman filter extends the basic tracker to provide robustness to occlusions and noise. The linear time complexity of this method is reflected in its computational efficiency and high processing rate. Comparisons with two recent trackers show superior tracking robustness at more than five times faster operation, all using a naïve C/C++ implementation and built-in OpenCV functions.
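The abstract's adaptive Kalman filter is not detailed here, but the general idea of adapting the filter to reject outlier measurements (as arise under occlusion or noise) can be illustrated with a minimal sketch. This is not the authors' algorithm: it is a hypothetical scalar, constant-model Kalman filter in which the measurement-noise variance `R` is inflated whenever the innovation falls outside a chi-square-style gate, so improbable measurements are down-weighted rather than blindly fused.

```python
def adaptive_kalman_1d(measurements, q=1e-3, r=0.05, gate=3.0):
    """Scalar Kalman filter with innovation-gated measurement noise.

    q    -- process-noise variance (state uncertainty growth per step)
    r    -- nominal measurement-noise variance
    gate -- innovation gate in standard deviations; beyond it, R is inflated
    All parameter names/values are illustrative, not from the paper.
    """
    x, p = measurements[0], 1.0          # state estimate and its variance
    out = []
    for z in measurements:
        p += q                           # predict: uncertainty grows by q
        y = z - x                        # innovation (measurement residual)
        s = p + r                        # predicted innovation variance
        # Adapt: if the innovation is improbable under s, inflate R so the
        # outlier (e.g. a mismatch during occlusion) barely moves the state.
        if y * y > gate * gate * s:
            r_eff = r * (y * y) / (gate * gate * s)
        else:
            r_eff = r
        s = p + r_eff
        k = p / s                        # Kalman gain
        x += k * y                       # update state toward measurement
        p *= (1.0 - k)                   # shrink variance by the gain
        out.append(x)
    return out
```

For example, feeding the filter a short track with one gross outlier (`[0.0, 0.1, 0.05, 5.0, 0.1]`) leaves the estimate near 0.1: the spurious jump to 5.0 is suppressed by the inflated `R`, whereas a fixed-`R` filter would be dragged far off the track.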

Original language: English
Pages (from-to): 271-285
Number of pages: 15
Journal: Journal of Real-Time Image Processing
Volume: 11
Issue number: 2
DOIs
State: Published - 1 Feb 2016

ASJC Scopus subject areas

  • Information Systems
