Classifying Tracked Objects in Far-Field Video Sequences
Speaker: Biswajit Bose, Vision Research Group, CSAIL
Classification of moving objects in outdoor video sequences is a challenging problem. The classification system can only be trained on labelled data collected from a small number of scenes, but needs to perform well under varying position, orientation and motion of objects with respect to the camera. Many features that are discriminative in a given scene (such as projected size and position of objects) tend to have scene-dependent distributions, and thus cannot be used for training a scene-invariant classifier that is expected to work well in arbitrary scenes. We present an object classification framework that initially uses scene-invariant features (to allow for good baseline performance in any scene) but provides a mechanism for later using scene-specific features and adapting a classifier to a given scene-specific distribution with the help of unlabelled data in that scene. Estimates of mutual information between object features and class labels across scenes are used to perform feature selection and grouping. Experimental results are demonstrated in the context of classification of moving objects---mostly vehicles and pedestrians---that are detected and tracked in far-field video sequences captured by static, uncalibrated cameras.
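The abstract mentions using estimates of mutual information between object features and class labels to select features. As a minimal sketch of that idea (the actual estimator and feature names are assumptions, not taken from the talk), a plug-in estimate of I(F; C) from paired samples of a discretized feature and a class label can be computed as follows:

```python
from collections import Counter
from math import log2

def mutual_information(features, labels):
    """Plug-in estimate of I(F; C) in bits from paired samples of a
    discretized feature value and a class label.

    This is a generic empirical estimator, offered only as an
    illustration of MI-based feature scoring; the talk's own
    estimator may differ.
    """
    n = len(features)
    pf = Counter(features)            # marginal counts of feature bins
    pc = Counter(labels)              # marginal counts of class labels
    pfc = Counter(zip(features, labels))  # joint counts
    mi = 0.0
    for (f, c), nfc in pfc.items():
        p_joint = nfc / n
        # p_joint * log2( p_joint / (p_f * p_c) ), with counts folded in
        mi += p_joint * log2(p_joint * n * n / (pf[f] * pc[c]))
    return mi

# Hypothetical usage: a binned aspect-ratio feature that separates
# pedestrians (tall) from vehicles (wide) scores high; an
# uninformative feature scores near zero.
informative = mutual_information([0, 0, 1, 1], ["ped", "ped", "veh", "veh"])
uninformative = mutual_information([0, 1, 0, 1], ["ped", "ped", "veh", "veh"])
```

Features whose MI with the class label is high across many scenes would be candidates for the scene-invariant set; features informative only within a single scene would be deferred to the scene-specific adaptation stage.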