dc.description.abstract | Gait is a weak biometric compared to face, fingerprint or iris because it can easily be affected by various conditions. These are known as covariate conditions and include clothing, carrying condition, speed, shoes and view, among others. In the presence of variable covariate conditions, gait recognition remains a hard, unsolved problem, and no working system has been reported.
In this thesis, a novel gait representation, the Gait Flow Image (GFI), is proposed to extract more discriminative information from a gait sequence. The GFI captures the relative motion of body parts in different directions using separate motion descriptors. Compared to existing model-free gait representations, the GFI is more discriminative and more robust to changes in covariate conditions.
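As an illustration of the kind of direction-separated motion descriptors the GFI is built from, the sketch below computes dense optical flow between consecutive silhouette frames and accumulates its positive and negative horizontal and vertical components into four descriptor images. This is a minimal sketch only; the preprocessing, the Farneback flow parameters and the four-way directional split are assumptions made for illustration, not the implementation used in the thesis.

```python
# Illustrative sketch (not the thesis implementation): build direction-separated
# motion descriptors from a sequence of binary silhouettes, in the spirit of the
# GFI described above. Flow parameters and the 4-way split are assumptions.
import cv2
import numpy as np

def directional_flow_descriptors(silhouettes):
    """silhouettes: list of HxW uint8 binary masks (0/255) covering one gait cycle."""
    H, W = silhouettes[0].shape
    # One accumulator per motion direction: +x, -x, +y, -y (image coordinates).
    acc = {d: np.zeros((H, W), np.float32) for d in ("right", "left", "down", "up")}
    for prev, curr in zip(silhouettes[:-1], silhouettes[1:]):
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        fx, fy = flow[..., 0], flow[..., 1]
        acc["right"] += np.maximum(fx, 0)
        acc["left"]  += np.maximum(-fx, 0)
        acc["down"]  += np.maximum(fy, 0)
        acc["up"]    += np.maximum(-fy, 0)
    n = len(silhouettes) - 1
    # Average each directional component over the cycle to obtain one
    # descriptor image per direction.
    return {d: a / n for d, a in acc.items()}
```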
In this thesis, gait recognition approaches are evaluated without the assumption of cooperative subjects, i.e. both the gallery and the probe sets consist of gait sequences captured under different and unknown covariate conditions. The results indicate that the performance of existing approaches drops drastically under this more realistic set-up. It is argued that selecting gait features that are invariant to changes in covariate conditions is the key to developing a gait recognition system that does not require subject cooperation. To this end, the Gait Entropy Image (GEnI) is proposed to perform automatic feature selection on each pair of gallery and probe gait sequences. Moreover, an Adaptive Component and Discriminant Analysis is formulated, which seamlessly integrates the feature selection method with subspace analysis for fast and robust recognition.
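The sketch below illustrates a GEnI-style computation, assuming binary silhouettes over one gait cycle and per-pixel Shannon entropy of the foreground/background distribution; the normalisation, the fixed threshold and the helper select_dynamic_features are illustrative assumptions rather than the exact feature selection procedure of the thesis.

```python
# A minimal sketch of a Gait Entropy Image style computation, assuming binary
# silhouettes and per-pixel Shannon entropy over one gait cycle. The threshold
# and pairwise mask below are simplifications for illustration only.
import numpy as np

def gait_entropy_image(silhouettes):
    """silhouettes: (T, H, W) array of binary masks for one gait cycle."""
    p = np.clip(silhouettes.astype(np.float32).mean(axis=0), 1e-6, 1 - 1e-6)
    # Per-pixel entropy of the foreground/background distribution: static
    # pixels (always 0 or always 1) get low entropy, dynamic ones high.
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return h / (h.max() + 1e-12)  # normalise to [0, 1]

def select_dynamic_features(gallery_geni, probe_geni, thresh=0.5):
    """Keep only pixels that are dynamic (high entropy) in both sequences
    of a gallery/probe pair; the mask is then applied to their gait features."""
    return (gallery_geni > thresh) & (probe_geni > thresh)
```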
Among the various factors that affect the performance of gait recognition, a change in viewpoint poses the biggest problem and is treated separately. A novel approach to this problem is proposed in this thesis, using the Gait Flow Image in a cross-view gait recognition framework in which the view angle of a probe gait sequence is unknown. A Gaussian Process classification technique is formulated to estimate the view angle of each probe gait sequence. To measure the similarity of gait sequences across view angles, the correlation of gait sequences from different views is modelled using Canonical Correlation Analysis, and the correlation strength is used as the similarity measure. This differs from existing approaches, which reconstruct gait features in different views through 2D view transformation or 3D calibration. Without explicit reconstruction, the proposed method can cope with feature mismatch across views and is more robust to feature noise. | en_US |
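The sketch below shows how canonical correlation can serve as a cross-view similarity measure in the spirit of the approach described above: a CCA model is fitted on paired training features from two views, and the correlation of the projected probe and gallery features is used as the matching score. The use of scikit-learn's CCA, the number of components and the cosine-style scoring are assumptions made for this illustration, not the exact formulation in the thesis.

```python
# A rough sketch of canonical correlation strength as a cross-view gait
# similarity measure. Library choice, component count and scoring are
# assumptions for illustration only.
import numpy as np
from sklearn.cross_decomposition import CCA

def fit_cross_view_cca(feats_view_a, feats_view_b, n_components=5):
    """feats_view_*: (N, D) gait features of the same N training subjects
    observed from two different view angles."""
    cca = CCA(n_components=n_components)
    cca.fit(feats_view_a, feats_view_b)
    return cca

def cross_view_similarity(cca, probe_feat, gallery_feat):
    """Correlation of the CCA projections of a probe (view A) and a gallery
    (view B) gait feature, used directly as the matching score."""
    u, v = cca.transform(probe_feat[None, :], gallery_feat[None, :])
    u, v = u.ravel(), v.ravel()
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))
```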