Title: Images-to-images person ReID without temporal linking

Authors: Thuy-Binh Nguyen; Thi-Lan Le; Ngoc-Nam Pham

Addresses: School of Electronics and Telecommunications and International Research Institute MICA, Hanoi University of Science and Technology, Hanoi, Vietnam; Faculty of Electrical and Electronics, University of Transport and Communications, Hanoi, Vietnam; International Research Institute MICA, Hanoi University of Science and Technology, 1 Dai Co Viet Street, Hai Ba Trung, Hanoi, Vietnam; School of Electronics and Telecommunications, Hanoi University of Science and Technology, 1 Dai Co Viet Street, Hai Ba Trung, Hanoi, Vietnam

Abstract: This paper addresses images-to-images person re-identification (ReID), in which there are multiple images for each individual in both the gallery and the probe sets. Most existing approaches that try to extract/learn features require temporal linking between frames. This paper proposes a novel framework that removes this requirement by formulating images-to-images person re-identification as a fusion of image-to-images matching. First, a ranked list of candidates is determined for each query image. Then, these lists are fused to determine the matched person. The contributions of the paper are two-fold: 1) an extra feature [Gaussian of Gaussian (GOG)] is used for representing a person; 2) a new images-to-images scheme is proposed that does not require temporal linking while retaining the benefits of the image-to-images scheme. Extensive experiments on the CAVIAR4REID (cases A and B) and RAiD datasets prove the effectiveness of the framework. The proposed scheme obtains +20.88%, +10.23%, and +10.39% improvement in rank-1 accuracy over the image-to-images scheme on these datasets.
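The abstract's two-step procedure (per-query ranked lists, then fusion) can be sketched as follows. This is a minimal illustration, not the paper's actual method: the distance metric (Euclidean) and the rank-aggregation rule (a Borda-style sum of ranks) are assumptions for the sake of example, and the toy 2-D features stand in for real descriptors such as GOG.

```python
import math

def rank_list(query, gallery):
    """Indices of gallery entries sorted by ascending Euclidean distance
    to one query image (one image-to-images ranked list)."""
    dists = [math.dist(query, g) for g in gallery]
    return sorted(range(len(gallery)), key=lambda i: dists[i])

def fuse_rank_lists(ranked_lists, n_gallery):
    """Borda-style fusion (an illustrative choice): sum each candidate's
    rank across all query images; the lowest aggregate rank wins."""
    scores = [0] * n_gallery
    for ranked in ranked_lists:
        for rank, idx in enumerate(ranked):
            scores[idx] += rank
    return sorted(range(n_gallery), key=lambda i: scores[i])

# Toy example: 3 query images of one person, 4 gallery identities.
gallery = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0), (9.0, 0.0)]
queries = [(0.9, 1.1), (1.2, 0.8), (1.0, 1.0)]
lists = [rank_list(q, gallery) for q in queries]
fused = fuse_rank_lists(lists, len(gallery))
print(fused[0])  # → 1, the gallery identity closest to all three queries
```

Because fusion operates only on the independently computed ranked lists, no temporal ordering of the query images is ever needed, which is the property the abstract highlights.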

Keywords: multi-shot; person ReID; feature fusion; images-to-images person ReID.

DOI: 10.1504/IJCVR.2019.098798

International Journal of Computational Vision and Robotics, 2019 Vol.9 No.2, pp.152 - 171

Received: 03 May 2018
Accepted: 11 Jul 2018

Published online: 02 Apr 2019
