Robust Subspace Tracking With Missing Data and Outliers: Novel Algorithm With Convergence Guarantee

IEEE Transactions on Signal Processing, Volume 69, pages 2070-2085, 2021
Le Trung Thanh1,2, Nguyen Viet Dung1,3, Nguyen Linh Trung1, Karim Abed-Meraim4
1AVITECH Institute, University of Engineering and Technology, Vietnam National University, Hanoi, Vietnam
2PRISME Laboratory, University of Orléans, Orléans, France
3National Institute of Advanced Technologies of Brittany (ENSTA Bretagne), Brest, France
4PRISME Laboratory, University of Orléans, Orléans, France

Abstract

In this article, we propose a novel algorithm, PETRELS-ADMM, for subspace tracking in the presence of outliers and missing data. The proposed approach consists of two main stages: outlier rejection and subspace estimation. In the first stage, the alternating direction method of multipliers (ADMM) is exploited to detect outliers in the observed data. In the second stage, we propose an improved version of the parallel estimation and tracking by recursive least squares (PETRELS) algorithm to update the underlying subspace when data are missing. We then present a theoretical convergence analysis of PETRELS-ADMM showing that it generates a sequence of subspace estimates converging to the optimum of its batch counterpart. The effectiveness of the proposed algorithm, compared with state-of-the-art algorithms, is illustrated on both simulated and real data.
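The two-stage procedure summarized above can be sketched as follows. This is a simplified illustration, not the authors' implementation: the outlier-rejection step is approximated here by a plain alternating ℓ2/ℓ1 minimization (soft-thresholding the residual) standing in for the paper's ADMM solver, and the subspace step is a basic per-row recursive least-squares update in the spirit of PETRELS. The function name, parameters `lam`, `beta`, and `n_inner`, and all numeric choices are assumptions for the sketch.

```python
import numpy as np

def soft_threshold(v, tau):
    # elementwise shrinkage operator used to isolate sparse outliers
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def robust_subspace_tracker(X, mask, r, lam=0.5, beta=0.98, n_inner=10, seed=0):
    """Simplified two-stage tracker in the spirit of PETRELS-ADMM (sketch).

    Stage 1: split each partially observed sample into a subspace component
    and a sparse outlier via alternating minimization (a stand-in for the
    ADMM step of the paper).
    Stage 2: PETRELS-style recursive least-squares update of the observed
    rows of the subspace estimate U, with forgetting factor `beta`.
    """
    n, T = X.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n, r))
    # one r x r correlation matrix per row of U, as PETRELS maintains
    R = np.stack([np.eye(r) for _ in range(n)])
    for t in range(T):
        omega = np.flatnonzero(mask[:, t])   # indices observed at time t
        x, Uo = X[omega, t], U[omega]
        s = np.zeros(len(omega))
        # Stage 1: alternate coefficient fit and outlier shrinkage
        for _ in range(n_inner):
            w = np.linalg.lstsq(Uo, x - s, rcond=None)[0]
            s = soft_threshold(x - Uo @ w, lam)
        # Stage 2: RLS update of each observed row of U on cleaned data
        resid = x - s - Uo @ w
        for i, m in enumerate(omega):
            R[m] = beta * R[m] + np.outer(w, w)
            U[m] += resid[i] * np.linalg.solve(R[m], w)
    return U
```

On synthetic data drawn from a fixed low-rank subspace with sparse large outliers and random misses, the returned `U` typically aligns closely (in principal angles) with the true subspace after a few hundred samples.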

Keywords

Alternating direction method of multipliers (ADMM), missing data, online robust PCA, outliers, robust matrix completion, robust subspace tracking
