• Sida Zhang Information Engineering College of HAUST, Luoyang 471003, China
  • Wei Li Information Engineering College of HAUST, Luoyang 471003, China
  • Renfang Wang College of Big Data and Software Engineering, Zhejiang Wanli University, Ningbo 315100, China
Keywords: Video tracking, KCF algorithm, VGG-16 neural network, equal interval frame update


To address tracking failures of the KCF algorithm caused by occlusion, deformation, and interference from similar objects, this paper proposes an improved algorithm that incorporates the VGG-16 neural network. First, the strong feature-extraction capability of the VGG-16 network is used to extract features, from different layers and different operations, that are more robust to deformation and occlusion. Then, the cyclic shift matrix of the KCF algorithm is used to generate a large number of training samples for the classifier, and the filtering response of image blocks in new frames is computed to predict the target position. To improve the real-time performance of the algorithm, the model is updated at a fixed frame interval, which reduces computational complexity. Compared with the traditional KCF algorithm, the proposed method effectively handles interference factors such as deformation and occlusion, and achieves faster target tracking while maintaining accuracy.
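The pipeline the abstract describes — a correlation filter trained over cyclic shifts, a filtering response whose peak predicts the target position, and a model refreshed only every few frames — can be sketched minimally in NumPy. This is the linear-kernel case only, not the paper's exact implementation: the VGG-16 feature-extraction stage is omitted (raw patches stand in for deep features), and the update interval and learning rate `eta` are assumed illustrative values, not the paper's settings.

```python
import numpy as np

def train_filter(x, y, lam=1e-4):
    """Ridge regression in the Fourier domain.

    Because the training set consists of all cyclic shifts of the base
    patch x, the data matrix is circulant and the solution diagonalizes
    under the DFT, so the filter is computed element-wise.
    """
    X = np.fft.fft2(x)
    Y = np.fft.fft2(y)
    return Y * np.conj(X) / (X * np.conj(X) + lam)

def detect(alpha_f, z):
    """Filtering response over all cyclic shifts of search patch z.

    The location of the response peak is the predicted translation of
    the target relative to the label peak.
    """
    resp = np.real(np.fft.ifft2(alpha_f * np.fft.fft2(z)))
    return np.unravel_index(np.argmax(resp), resp.shape)

def update_model(alpha_f, x, y, frame_idx, interval=5, eta=0.02):
    """Fixed-frame-interval model update.

    Retrain only every `interval` frames to cut per-frame cost;
    otherwise the previous filter is reused unchanged. `interval`
    and `eta` here are assumed values for illustration.
    """
    if frame_idx % interval != 0:
        return alpha_f
    return (1.0 - eta) * alpha_f + eta * train_filter(x, y)

# Toy usage: a Gaussian label peaked at the origin, a random base
# patch, and a search patch that is the base patch cyclically shifted
# by (3, 5) -- the detector should recover that shift.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 32))
ys, xs = np.mgrid[0:32, 0:32]
d2 = np.minimum(ys, 32 - ys) ** 2 + np.minimum(xs, 32 - xs) ** 2
y = np.exp(-d2 / (2.0 * 2.0 ** 2))          # circular Gaussian label

alpha_f = train_filter(x, y)
z = np.roll(x, shift=(3, 5), axis=(0, 1))   # simulated target motion
peak = detect(alpha_f, z)                   # -> (3, 5)
```

The kernelized version in the paper replaces the element-wise linear solution with a kernel correlation term, but the cyclic-shift training trick and the peak-of-response prediction are the same.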




How to Cite
Zhang, S., Li, W., & Wang, R. (2019). KCF TRACKING ALGORITHM BASED ON VGG16 DEPTH FRAMEWORK. International Journal of Advanced Computer Technology, 8(2), 05-09. Retrieved from http://ijact.org/index.php/ijact/article/view/13