Maria Brbic

  • Multi-view Low-rank Sparse Subspace Clustering

    Most existing approaches address the multi-view subspace clustering problem by constructing an affinity matrix on each view separately and then extending a spectral clustering algorithm to handle multi-view data. This paper presents an approach to multi-view subspace clustering that learns a joint subspace representation by constructing an affinity matrix shared among all views. Relying on the importance of both low-rank and sparsity constraints in the construction of the affinity matrix, we introduce an objective that balances agreement across the different views while encouraging sparsity and low-rankness of the solution. The related low-rank and sparsity constrained optimization problem is solved for each view using the alternating direction method of multipliers (ADMM). Furthermore, we extend our approach to cluster data drawn from nonlinear subspaces by solving the corresponding problem in a reproducing kernel Hilbert space. The proposed algorithm outperforms state-of-the-art multi-view subspace clustering algorithms on one synthetic and four real-world datasets. (A simplified sketch of the shared-representation ADMM follows this entry.)

    08/29/2017 ∙ by Maria Brbic, et al.

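    Below is a minimal numerical sketch of the idea described above: a single representation matrix C is shared by all views, while low-rankness and sparsity are handled through auxiliary variables in a standard ADMM splitting. This is not the authors' exact formulation; the splitting, the hyperparameters lam1, lam2, mu, and the fixed iteration count are illustrative assumptions.

    ```python
    # Hedged sketch of multi-view low-rank sparse subspace clustering:
    # one shared representation C, auxiliary low-rank (J) and sparse (A)
    # variables, updated by ADMM. Hyperparameters are assumptions.
    import numpy as np

    def multiview_lrssc(views, lam1=1.0, lam2=1.0, mu=10.0, n_iters=100):
        """views: list of (d_v, n) data matrices with samples as columns.

        Sketches  min_C sum_v ||X_v - X_v C||_F^2 + lam1 ||J||_* + lam2 ||A||_1
        subject to C = J and C = A.
        """
        n = views[0].shape[1]
        G = sum(X.T @ X for X in views)            # Gram term shared by all views
        J = np.zeros((n, n)); A = np.zeros((n, n))
        Y1 = np.zeros((n, n)); Y2 = np.zeros((n, n))
        lhs = np.linalg.inv(2.0 * G + 2.0 * mu * np.eye(n))
        for _ in range(n_iters):
            # C-update: quadratic subproblem, closed form
            C = lhs @ (2.0 * G + mu * J - Y1 + mu * A - Y2)
            # J-update: singular value soft-thresholding (prox of the nuclear norm)
            U, s, Vt = np.linalg.svd(C + Y1 / mu, full_matrices=False)
            J = U @ np.diag(np.maximum(s - lam1 / mu, 0.0)) @ Vt
            # A-update: entrywise soft-thresholding (prox of the l1 norm)
            B = C + Y2 / mu
            A = np.sign(B) * np.maximum(np.abs(B) - lam2 / mu, 0.0)
            np.fill_diagonal(A, 0.0)               # no self-representation
            # dual ascent on both constraints
            Y1 = Y1 + mu * (C - J)
            Y2 = Y2 + mu * (C - A)
        return 0.5 * (np.abs(C) + np.abs(C).T)     # symmetric affinity matrix
    ```

    The returned affinity matrix would typically be fed to a spectral clustering routine (e.g., scikit-learn's SpectralClustering with affinity="precomputed"). Roughly speaking, the kernelized variant mentioned in the abstract would replace the Gram terms X_v^T X_v with kernel matrices.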

  • ℓ_0-Motivated Low-Rank Sparse Subspace Clustering

    In many applications, high-dimensional data points can be well represented by low-dimensional subspaces. To identify these subspaces, it is important to capture both the global and the local structure of the data, which is achieved by imposing low-rank and sparsity constraints on the data representation matrix. In low-rank sparse subspace clustering (LRSSC), the nuclear and ℓ_1 norms are used to measure rank and sparsity. However, the nuclear and ℓ_1 norms overpenalize the solution and only approximate the original problem. In this paper, we propose two ℓ_0 quasi-norm based regularizations. First, we present a regularization based on a multivariate generalization of the minimax-concave penalty (GMC-LRSSC), which preserves the global minimizers of the ℓ_0 quasi-norm regularized objective. We then introduce an objective with Schatten-0 (S_0) and ℓ_0 regularization and approximate the proximal map of the joint solution using a proximal average method (S_0/ℓ_0-LRSSC). The resulting nonconvex optimization problems are solved using the alternating direction method of multipliers, with convergence conditions established for both algorithms. Results obtained on synthetic data and four real-world datasets show the effectiveness of GMC-LRSSC and S_0/ℓ_0-LRSSC compared to state-of-the-art methods. (A sketch of the corresponding proximal operators follows this entry.)

    12/17/2018 ∙ by Maria Brbic, et al.

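    As a companion to the description above, here is a minimal sketch of the ℓ_0-motivated proximal operators involved: entrywise hard thresholding for the ℓ_0 quasi-norm, singular-value hard thresholding for S_0, a scalar firm threshold as the proximal map of the (scalar) minimax-concave penalty, and a proximal average combining the S_0 and ℓ_0 proxes. The paper's GMC penalty is a multivariate generalization, so the scalar firm threshold and the parameters t, gamma, and w here are simplifying assumptions, not the authors' exact operators.

    ```python
    # Hedged sketch of l0-motivated proximal steps that would sit inside an
    # LRSSC-style ADMM loop. Thresholds and weights are assumptions.
    import numpy as np

    def hard_threshold(V, t):
        """Prox of t * ||V||_0: zero out entries with |v| <= sqrt(2 t)."""
        return np.where(np.abs(V) > np.sqrt(2.0 * t), V, 0.0)

    def prox_schatten0(V, t):
        """Prox of t * rank(V) (S_0): hard-threshold the singular values."""
        U, s, Vt = np.linalg.svd(V, full_matrices=False)
        return U @ np.diag(np.where(s > np.sqrt(2.0 * t), s, 0.0)) @ Vt

    def firm_threshold(V, lam, gamma=3.0):
        """Prox of the scalar minimax-concave penalty (gamma > 1):
        zero below lam, linear ramp up to gamma*lam, identity beyond."""
        a = np.abs(V)
        out = np.where(a <= lam, 0.0, V)
        mid = (a > lam) & (a <= gamma * lam)
        return np.where(mid, np.sign(V) * gamma * (a - lam) / (gamma - 1.0), out)

    def prox_average(V, t, w=0.5):
        """Approximate the prox of w*S_0 + (1-w)*l0 by averaging the two
        individual proximal maps (the proximal average heuristic)."""
        return w * prox_schatten0(V, t) + (1.0 - w) * hard_threshold(V, t)
    ```

    In an ADMM loop of the kind sketched under the first entry, these operators would replace the soft-thresholding steps of the convex nuclear-norm/ℓ_1 formulation.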