Partially Camouflaged Object Tracking using Modified Probabilistic Neural Network and Fuzzy Energy based Active Contour

Article Type

Research Article

Publication Title

International Journal of Computer Vision

Abstract

Various problems in object detection and tracking have attracted researchers to develop methodologies for solving them. The occurrence of camouflage is one such challenge that makes object detection and tracking more complex. However, less attention has been given to detecting and tracking camouflaged objects due to the complexity of the problem. In this article, we propose a tracking-by-detection algorithm to detect and track camouflaged objects. To increase the separability between the camouflaged object and the background, we propose to integrate features from multiple cues (color, shape and texture), namely CIELab, histogram of oriented gradients and locally adaptive ternary pattern, to represent a camouflaged object. A probabilistic neural network (PNN) is modified to construct an efficient discriminative appearance model for detecting camouflaged objects in video sequences. In the modified PNN, the large set of training patterns (many of which may be redundant) is reduced based on the motion of the object. This modification makes the detection process faster and also increases the detection accuracy. Due to the high visual similarity between the camouflaged object and the background, the boundary of the camouflaged object is not well defined (i.e., the boundary may be smooth and/or discontinuous). In this context, a robust fuzzy energy based active contour model using both global and local information is proposed to extract the contour (boundary) of the detected camouflaged object for tracking. We present a realization of the proposed method and demonstrate its performance, both quantitatively and qualitatively, against state-of-the-art techniques on several challenging sequences. Analysis of the results shows that the proposed technique tracks fully or partially camouflaged objects, as well as objects in various complex environments, better than existing techniques.
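The sketch below illustrates, under stated assumptions, how the multi-cue appearance model and a PNN-style object/background decision described in the abstract might be realized; it is not the authors' implementation. The feature parameters (histogram bin counts, HOG cell and block sizes, the Gaussian spread sigma), the fixed ternary threshold standing in for the paper's locally adaptive ternary pattern, and the helper names (lab_histogram, hog_descriptor, ternary_pattern_histogram, pnn_score, is_object) are all illustrative assumptions. The paper's motion-based reduction of training patterns and the fuzzy energy based active contour stage are not reproduced here, and candidate patches are assumed to share one fixed size (at least 16 x 16 pixels) so that feature vectors are comparable.

import numpy as np
import cv2
from skimage.feature import hog

# Offsets of the 8 neighbours around a centre pixel (clockwise).
NEIGHBOUR_OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                     (1, 1), (1, 0), (1, -1), (0, -1)]

def lab_histogram(patch_bgr, bins=16):
    # Colour cue: per-channel histogram in CIELab space.
    lab = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2LAB)
    hists = [np.histogram(lab[..., c], bins=bins, range=(0, 255))[0] for c in range(3)]
    h = np.concatenate(hists).astype(np.float64)
    return h / (h.sum() + 1e-8)

def hog_descriptor(patch_bgr):
    # Shape cue: histogram of oriented gradients on the grey-level patch.
    grey = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2GRAY)
    return hog(grey, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def ternary_pattern_histogram(patch_bgr, thresh=5, bins=64):
    # Texture cue: a basic local ternary pattern (a simplified stand-in for
    # the paper's locally adaptive ternary pattern; the adaptive threshold
    # is replaced here by the fixed value `thresh`).
    grey = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2GRAY).astype(np.int32)
    centre = grey[1:-1, 1:-1]
    codes = np.zeros(centre.shape, dtype=np.int32)
    for k, (dy, dx) in enumerate(NEIGHBOUR_OFFSETS):
        neigh = grey[1 + dy:grey.shape[0] - 1 + dy, 1 + dx:grey.shape[1] - 1 + dx]
        diff = neigh - centre
        t = np.where(diff > thresh, 2, np.where(diff < -thresh, 0, 1))
        codes += (3 ** k) * t
    h, _ = np.histogram(codes, bins=bins, range=(0, 3 ** 8))
    return h.astype(np.float64) / (h.sum() + 1e-8)

def multi_cue_feature(patch_bgr):
    # Concatenate colour, shape and texture cues into one feature vector.
    return np.concatenate([lab_histogram(patch_bgr),
                           hog_descriptor(patch_bgr),
                           ternary_pattern_histogram(patch_bgr)])

def pnn_score(x, class_patterns, sigma=0.1):
    # Summation-layer output of a Parzen-window / PNN classifier: the mean
    # Gaussian kernel response of x against the stored training patterns
    # of one class (rows of class_patterns).
    d2 = np.sum((class_patterns - x) ** 2, axis=1)
    return np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))

def is_object(patch_bgr, object_patterns, background_patterns, sigma=0.1):
    # Competitive layer: the patch is labelled "object" if the object class
    # wins over the background class.
    x = multi_cue_feature(patch_bgr)
    return pnn_score(x, object_patterns, sigma) > pnn_score(x, background_patterns, sigma)

In this sketch, object_patterns and background_patterns would be matrices whose rows are multi_cue_feature vectors computed from labelled training patches of the same size; a detector would slide candidate patches over the frame and keep those for which is_object returns True, before a contour-extraction stage refines the object boundary.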

First Page

116

Last Page

148

DOI

10.1007/s11263-016-0959-5

Publication Date

3-1-2017

