Block and Group Regularized Sparse Modeling for Dictionary Learning
Yu-Tseh Chi, Mohsen Ali, Ajit Rajwade, and Jeffrey Ho
Illustration of the proposed BGSC framework.
[Full Text (pdf) | Supplementary Material | Demo code in Matlab ]
Abstract
This paper proposes a dictionary learning framework that combines the proposed block/group (BGSC) or reconstructed block/group (R-BGSC) sparse coding schemes with the novel Intra-block Coherence Suppression Dictionary Learning (ICS-DL) algorithm. An important and distinguishing feature of the proposed framework is that all dictionary blocks are trained simultaneously with respect to each data group, while the intra-block coherence is explicitly minimized as an important objective. We provide both empirical evidence and heuristic support for this feature, which can be considered a direct consequence of incorporating both the group structure of the input data and the block structure of the dictionary into the learning process. The optimization problems for both dictionary learning and sparse coding can be solved efficiently using block-gradient descent, and the details of the optimization algorithms are presented. We evaluate the proposed methods on several classification (supervised) and clustering (unsupervised) problems using well-known datasets. Favorable comparisons with state-of-the-art dictionary learning methods demonstrate the viability and validity of the proposed framework.
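To make the block-structured sparse coding idea concrete, the sketch below solves a standard group-sparse coding problem (a group-Lasso objective with the dictionary partitioned into blocks) by proximal gradient descent with block soft-thresholding. This is a generic illustration under assumed notation, not the paper's exact BGSC/R-BGSC formulation or its block-gradient-descent solver; the function names, the penalty form, and the block index lists are all assumptions for illustration.

```python
import numpy as np

def block_soft_threshold(a, t):
    # Proximal operator of t * ||a||_2: shrinks the whole block toward zero,
    # setting it exactly to zero when its norm is below the threshold t.
    norm = np.linalg.norm(a)
    if norm <= t:
        return np.zeros_like(a)
    return (1.0 - t / norm) * a

def group_sparse_code(x, D, blocks, lam=0.1, n_iter=200):
    """Proximal-gradient (ISTA) solver for the group-sparse coding problem
         min_a  0.5 * ||x - D a||^2  +  lam * sum_b ||a_b||_2,
    where `blocks` is a list of index arrays partitioning the columns of D
    into dictionary blocks (assumed structure, for illustration only)."""
    L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)         # gradient of the data-fit term
        a = a - grad / L                 # gradient step
        for b in blocks:                 # block-wise proximal step
            a[b] = block_soft_threshold(a[b], lam / L)
    return a
```

With a moderate penalty weight, the block penalty drives entire blocks of coefficients to zero, so each signal is represented by a few dictionary blocks, which is the behavior the coefficient visualizations below illustrate.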
1. Hand Written Digit Recognition
Visualization of the coefficients of the training samples. Each "column" in the figure represents the coefficients of 150 samples. Note that similar digits (for example, 1, 7, and 9) share more blocks than dissimilar ones.