Abstract
Rank minimization has attracted much attention due to its robustness in data recovery. To overcome its computational intractability, the rank function is often replaced with the nuclear norm. For several rank minimization problems, such a replacement has been theoretically proven valid, i.e., the solution to the nuclear norm minimization problem is also the solution to the rank minimization problem. Although it is easy to believe that such a replacement may not always be valid, no concrete counterexample had previously been found. We argue that this validity cannot be checked by numerical computation and show, by analyzing the noiseless latent low-rank representation (LatLRR) model, that the validity may break down even for very simple rank minimization problems. As a by-product, we find that the solution to the nuclear norm minimization formulation of LatLRR is non-unique. Hence the results of LatLRR reported in the literature may be questionable.
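As background to the surrogate relationship the abstract discusses, the following is a minimal NumPy sketch (not from the paper; function names are ours) of the two objects involved: the nuclear norm, i.e., the sum of singular values used as a convex stand-in for rank, and singular value thresholding, the proximal operator of the nuclear norm that underlies most nuclear norm minimization solvers.

```python
import numpy as np

def nuclear_norm(X):
    # Nuclear norm = sum of singular values (the convex surrogate of rank).
    return np.linalg.svd(X, compute_uv=False).sum()

def svt(X, tau):
    # Singular value thresholding: shrink each singular value by tau.
    # This is the proximal operator of tau * ||.||_* (nuclear norm).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# A rank-1 matrix corrupted by small dense noise: numerically full rank,
# but one thresholding step recovers a genuinely low-rank matrix.
rng = np.random.default_rng(0)
u = rng.standard_normal((5, 1))
v = rng.standard_normal((1, 5))
X = u @ v + 0.01 * rng.standard_normal((5, 5))

Y = svt(X, tau=0.1)
print(np.linalg.matrix_rank(X))  # full rank because of the noise
print(np.linalg.matrix_rank(Y))  # low rank after thresholding
print(nuclear_norm(X), nuclear_norm(Y))
```

This sketch only illustrates why the nuclear norm is computationally attractive; the paper's point is that minimizing it is not always equivalent to minimizing rank, and that the equivalence cannot be verified by numerical experiments alone.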
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, H., Lin, Z., Zhang, C. (2013). A Counterexample for the Validity of Using Nuclear Norm as a Convex Surrogate of Rank. In: Blockeel, H., Kersting, K., Nijssen, S., Železný, F. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2013. Lecture Notes in Computer Science(), vol 8189. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40991-2_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-40990-5
Online ISBN: 978-3-642-40991-2