
Proximal Methods for Sparse Hierarchical Dictionary Learning

Rodolphe Jenatton, Julien Mairal, Guillaume Obozinski, Francis Bach

Presented by Bo Chen, 2010, 6.11

Outline

• 1. Structured Sparsity

• 2. Dictionary Learning

• 3. Sparse Hierarchical Dictionary Learning

• 4. Experimental Results

Structured Sparsity

• Lasso (R. Tibshirani, 1996)

• Group Lasso (M. Yuan & Y. Lin, 2006)

• Tree-Guided Group Lasso (Kim & Xing, 2009)

Tree-Guided Structure Example

Tree Regularization Definition:

Kim & Xing, 2009

Multi-task:

Tree-Guided Structure Penalty

Introduce two parameters:

Rewrite the penalty term for the case of two tasks (K = 2):

Generally:

Kim & Xing, 2009

In Detail

Kim & Xing, 2009

Some Definitions about Hierarchical Groups

Hierarchical Sparsity-Inducing Norms
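As a rough aid, the defining property of hierarchical groups (any two groups are either disjoint or nested, e.g. each group being a node together with all of its descendants) can be checked mechanically; the sketch and toy groups below are illustrative only.

```python
def is_tree_structured(groups):
    """Defining property of hierarchical groups: any two groups are
    either disjoint or one contains the other."""
    sets = [frozenset(g) for g in groups]
    for i, g in enumerate(sets):
        for h in sets[i + 1:]:
            if (g & h) and not (g <= h or h <= g):
                return False
    return True

# Subtrees of a small tree (each group = a node plus its descendants):
print(is_tree_structured([{0, 1, 2, 3}, {1, 2}, {2}, {3}]))  # True
print(is_tree_structured([{0, 1}, {1, 2}]))                  # False: overlapping, not nested
```

With such a set of groups G, the hierarchical sparsity-inducing norm takes the form Ω(α) = Σ_{g∈G} ω_g ‖α_g‖, with ‖·‖ the ℓ2 or ℓ∞ norm; this is the penalty the proximal step later operates on.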

Dictionary Learning

When structure information is introduced, dictionary learning differs from the group Lasso in two ways:

1. The group Lasso solves a regression problem in which each feature has its own physical meaning, so the structure information must itself be meaningful and correct; otherwise the imposed 'structure' hurts the method.

2. In dictionary learning the dictionary is unknown, so the structure information acts as a guide that helps learn a structured dictionary; the resulting objective is sketched below.
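For reference, the structured dictionary-learning problem referred to here has the standard form below (a sketch in the paper's usual notation, with $x_1,\dots,x_n$ the training signals, $\alpha_1,\dots,\alpha_n$ their sparse codes, and $\mathcal{C}$ the set of dictionaries whose columns have at most unit Euclidean norm):

$$\min_{D \in \mathcal{C},\;\alpha_1,\dots,\alpha_n}\;\sum_{i=1}^{n}\Big(\tfrac{1}{2}\,\lVert x_i - D\alpha_i\rVert_2^2 \;+\; \lambda\,\Omega(\alpha_i)\Big),\qquad \mathcal{C}=\{\,D:\lVert d_j\rVert_2\le 1\ \text{for all } j\,\}.$$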

Optimization

• Proximal Operator for the Structured Norm

With the dictionary D fixed, the objective function is:


Transformed to a proximal problem:

Proximal operator with the structured penalty:
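The result exploited here is that, for tree-structured groups with ℓ2 norms, this proximal operator can be obtained by applying the elementary group thresholdings from the leaves up to the root. Below is a minimal NumPy sketch of that composition; the function names, weights, and toy tree are illustrative.

```python
import numpy as np

def group_soft_threshold(v, idx, t):
    """Elementary proximal step for one group: block soft-thresholding
    of the coordinates in idx with threshold t."""
    norm = np.linalg.norm(v[idx])
    v[idx] *= (max(0.0, 1.0 - t / norm) if norm > 0 else 0.0)
    return v

def prox_tree_norm(u, groups, weights, lam):
    """Proximal operator of lam * sum_g weights[g] * ||alpha_{groups[g]}||_2
    for tree-structured groups.  Children are contained in their ancestors,
    so sorting by group size visits the leaves before the root."""
    v = np.array(u, dtype=float)
    for g in sorted(range(len(groups)), key=lambda g: len(groups[g])):
        v = group_soft_threshold(v, list(groups[g]), lam * weights[g])
    return v

# Toy tree on 4 variables: root (0,1,2,3), children (1,2) and (3,), and (2,) below (1,2).
groups = [(0, 1, 2, 3), (1, 2), (2,), (3,)]
weights = [1.0, 1.0, 1.0, 1.0]
print(prox_tree_norm([0.3, 1.0, -2.0, 0.1], groups, weights, lam=0.5))
```

One proximal-gradient (ISTA) step on the fixed-dictionary objective then reads α ← prox_tree_norm(α − Dᵀ(Dα − x)/L, groups, weights, λ/L), with L an upper bound on the spectral norm of DᵀD.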

Learning the Dictionary

Updating D (5 times in each iteration):

Updating A:
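A sketch of the "update D" step, under the assumption that the talk follows the block-coordinate dictionary update of Mairal et al. (2010): with B = XAᵀ and C = AAᵀ accumulated from the current codes A, each column of D has a closed-form update followed by projection onto the unit ball. The five passes match the slide's "updating D 5 times in each iteration"; updating A is the structured sparse-coding step built on the proximal operator sketched above. Names and defaults are illustrative.

```python
import numpy as np

def update_dictionary(D, B, C, n_passes=5, eps=1e-12):
    """Block-coordinate descent on the columns of D for the quadratic
    surrogate defined by B = X @ A.T and C = A @ A.T, with each column
    projected back onto the unit Euclidean ball after its update."""
    D = D.copy()
    for _ in range(n_passes):
        for j in range(D.shape[1]):
            if C[j, j] > eps:
                D[:, j] += (B[:, j] - D @ C[:, j]) / C[j, j]
            D[:, j] /= max(1.0, np.linalg.norm(D[:, j]))
    return D
```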

Experiments: Natural Image Patches

• Use the dictionary learned on the training set to impute the missing values in test samples; each sample is an 8×8 patch (see the imputation sketch after this list).

• Training set: 50,000 patches; testing set: 25,000 patches.

• Test 21 balanced tree structures of depth 3 and 4, also setting the number of nodes in each layer.
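A hedged sketch of the imputation protocol described above: fit the structured sparse code of a test patch using only its observed pixels (the corresponding rows of the learned dictionary), then fill the holes with the reconstruction. Here sparse_coder stands for any structured sparse-coding routine, e.g. an ISTA loop built on the tree prox sketched earlier; all names are illustrative.

```python
import numpy as np

def impute_patch(x, mask, D, sparse_coder):
    """x: flattened 8x8 patch with arbitrary values at missing pixels;
    mask: boolean array, True where the pixel is observed;
    D: learned dictionary; sparse_coder(x_obs, D_obs) -> code alpha."""
    alpha = sparse_coder(x[mask], D[mask, :])   # code fit on observed pixels only
    x_hat = x.astype(float).copy()
    x_hat[~mask] = (D @ alpha)[~mask]           # fill the missing pixels
    return x_hat
```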

Learned Hierarchical Dictionary

Experiments: Text Documents

Key points:

Visualization of NIPS proceedings

Documents: 1714; words: 8274

Postings Classification

Training set: 1000; testing set: 425; documents: 1425; words: 13312.

Goal: classify the postings from the two newsgroups, alt.atheism and talk.religion.misc.