Tetsu Matsukawa [English | Japanese]
Research Works on Person Re-Identification
Hierarchical Gaussian Descriptors
Resources [PAMI'19]
CNN Features Learned from Combination of Attributes
Download
Discriminative Pooling of Convolutional Features
Publication
Kernelized Cross-view Quadratic Discriminant Analysis
In person re-identification, Keep It Simple and Straightforward MEtric (KISSME) is known as a practical distance metric learning method. Typically, kernelization improves the performance of metric learning methods; nevertheless, deriving KISSME on a reproducing kernel Hilbert space is a non-trivial problem. The Nyström method approximates the Hilbert space by a low-dimensional Euclidean space, in which applying KISSME is straightforward, yet it fails to preserve discriminative information. To utilize KISSME in a discriminative subspace of the Hilbert space, we propose a kernel extension of Cross-view Quadratic Discriminant Analysis (XQDA), which learns a discriminative low-dimensional subspace and simultaneously learns the KISSME metric in that subspace. We show that, with the standard kernel trick, kernelized XQDA reduces to the case in which the empirical kernel vector is used as the input of XQDA. Experimental results on benchmark datasets show that kernelized XQDA outperforms both XQDA and Nyström-KISSME. Download
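The reduction above (feeding empirical kernel vectors into the usual pipeline) can be illustrated with a minimal Python sketch. The function names, the RBF kernel choice, and the regularization constant are assumptions for illustration only, and the joint XQDA subspace learning is omitted; this is not the paper's implementation.

```python
import numpy as np

def empirical_kernel_map(X, X_train, gamma=1.0):
    """Map each row of X to its empirical kernel vector k(x, X_train).
    An RBF kernel is assumed here purely for illustration."""
    sq_dists = ((X[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def learn_kissme(diff_similar, diff_dissimilar, eps=1e-3):
    """KISSME metric M = inv(Cov_similar) - inv(Cov_dissimilar),
    estimated from difference vectors of similar / dissimilar pairs."""
    d = diff_similar.shape[1]
    cov_s = diff_similar.T @ diff_similar / len(diff_similar) + eps * np.eye(d)
    cov_d = diff_dissimilar.T @ diff_dissimilar / len(diff_dissimilar) + eps * np.eye(d)
    return np.linalg.inv(cov_s) - np.linalg.inv(cov_d)

def kissme_distance(M, a, b):
    """Squared distance of a pair under the learned metric M."""
    diff = a - b
    return float(diff @ M @ diff)

# Toy usage on random data with made-up pair labels:
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 16))            # training feature vectors
K = empirical_kernel_map(X_train, X_train)     # empirical kernel vectors
diff_s = K[:25] - K[25:]                       # hypothetical same-ID pairs
diff_d = K[:25] - K[::-1][:25]                 # hypothetical different-ID pairs
M = learn_kissme(diff_s, diff_d)
print(kissme_distance(M, K[0], K[1]))
```

In the full method, the discriminative low-dimensional projection and the KISSME metric are learned together by XQDA on these kernel vectors; the sketch only shows the mapping and the metric estimation steps.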
Copyright (c) Tetsu Matsukawa, All Rights Reserved.