Posts about learning

2015-12-11 Reproducing Olshausen's classical SparseNet (part 3)

In this notebook, we test the convergence of SparseNet as a function of different learning parameters. This shows the relative robustness of this method with respect to the coding parameters, but also the importance of homeostasis for obtaining an efficient set of filters:

  • first, whatever the learning rate, the convergence is not complete without homeostasis;
  • second, with homeostasis, we achieve a better convergence for similar learning rates, and this holds over a certain range of learning rates for the homeostasis;
  • third, the smoothing parameter alpha_homeo has to be properly set to achieve a good convergence (see the sketch after this list);
  • last, this homeostatic rule works with the different variants of sparse coding.
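
To make the role of alpha_homeo concrete, here is a minimal sketch of a smoothed homeostatic gain update; the names (variance, var_target, eta_gain) and the exact form of the rule are illustrative assumptions, not the notebook's actual code:

    import numpy as np

    def homeostatic_update(dictionary, coeffs, variance,
                           alpha_homeo=0.02, var_target=0.1, eta_gain=0.01):
        # exponential moving average of each atom's coefficient energy;
        # alpha_homeo sets the time scale of this smoothing
        variance = (1 - alpha_homeo) * variance \
            + alpha_homeo * np.mean(coeffs ** 2, axis=0)
        # rescale atoms (one per row) so that over-used atoms shrink and
        # under-used atoms catch up, equalizing competition across the dictionary
        gain = (variance / var_target) ** (-eta_gain)
        dictionary *= gain[:, np.newaxis]
        return dictionary, variance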


Read more…

2015-12-11 Reproducing Olshausen's classical SparseNet (part 4)

In this notebook, we write an example script for the sklearn library, showing how the introduction of Olshausen's homeostasis improves the convergence of dictionary learning.
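
A minimal sketch of the kind of example script at stake, using sklearn's MiniBatchDictionaryLearning on placeholder data (the random patches stand in for whitened natural image patches; this is not the submitted script itself):

    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    # placeholder data: 10000 patches of 12x12 pixels, flattened to rows
    patches = np.random.randn(10000, 12 * 12)

    # learn an overcomplete dictionary of 14x14 = 196 atoms
    dico = MiniBatchDictionaryLearning(n_components=14 * 14, alpha=1)
    V = dico.fit(patches).components_  # one learned atom per row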


Read more…

2015-05-12 Extending Olshausen's classical SparseNet

  • In a previous notebook, we tried to reproduce the learning strategy specified in the framework of the SparseNet algorithm from Bruno Olshausen. It allows one to efficiently code natural image patches by constraining the code to be sparse. In particular, we saw that in order to optimize competition, it is important to control cooperation, and we implemented a heuristic to do just this.

  • In this notebook, we provide an extension to the SparseNet algorithm. We study how homeostasis (cooperation) may be an essential ingredient for this algorithm, which works on a winner-take-all basis (competition); a sketch of such a gain-modulated winner-take-all step is given after the reference below. This extension has been published as Perrinet, Neural Computation (2010) (see http://invibe.net/LaurentPerrinet/Publications/Perrinet10shl ):

    @article{Perrinet10shl,
        Title = {Role of homeostasis in learning sparse representations},
        Author = {Perrinet, Laurent U.},
        Journal = {Neural Computation},
        Year = {2010},
        Doi = {10.1162/neco.2010.05-08-795},
        Keywords = {Neural population coding, Unsupervised learning, Statistics of natural images, Simple cell receptive fields, Sparse Hebbian Learning, Adaptive Matching Pursuit, Cooperative Homeostasis, Competition-Optimized Matching Pursuit},
        Month = {July},
        Number = {7},
        Url = {http://invibe.net/LaurentPerrinet/Publications/Perrinet10shl},
        Volume = {22},
    }
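
As announced above, here is a hedged sketch of a gain-modulated Matching Pursuit step in the spirit of this extension: homeostatic gains bias the winner-take-all choice so that, over learning, all atoms end up selected equally often. All names and the exact form of the modulation are illustrative assumptions:

    import numpy as np

    def gain_modulated_mp(signal, dictionary, gain, n_active=10):
        # dictionary: (n_atoms, n_features) with L2-normalized rows;
        # gain: (n_atoms,) homeostatic gains from the learning loop
        residual = signal.copy()
        coeffs = np.zeros(dictionary.shape[0])
        for _ in range(n_active):
            c = dictionary @ residual  # correlation with each atom
            # competition: gains rescale the scores before the argmax
            winner = np.argmax(np.abs(gain * c))
            coeffs[winner] += c[winner]
            residual = residual - c[winner] * dictionary[winner]
        return coeffs, residual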

Read more…

2015-05-06 Reproducing Olshausen's classical SparseNet (part 2)

  • In a previous notebook, we tried to reproduce the learning strategy specified in the framework of the SparseNet algorithm from Bruno Olshausen. It allows one to efficiently code natural image patches by constraining the code to be sparse. We have shown that:

    • one can denoise an image using the learned dictionary,
    • the dictionaries learned with the different algorithms are qualitatively the same,
    • the efficiency is roughly similar, but best when both the learning and the coding use Orthogonal Matching Pursuit (see the sketch after this list).
  • However, the dictionaries are qualitatively not the same as those from the original paper, and this is certainly due to the lack of control of the competition during the learning phase.

  • Herein, we re-implement the cooperation mechanism in the dictionary learning routine; this will then be proposed for inclusion in the main code.
  • The goal of this notebook is to illustrate a pull request (PR) to sklearn.
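
As a pointer to the denoising test mentioned in the list above, here is a hedged sketch with sklearn, on placeholder data (the actual notebook uses whitened natural image patches):

    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    rng = np.random.RandomState(0)
    patches = rng.randn(5000, 8 * 8)                # placeholder clean patches
    noisy = patches + 0.5 * rng.randn(5000, 8 * 8)  # additive Gaussian noise

    dico = MiniBatchDictionaryLearning(n_components=100).fit(patches)
    # sparse coding with Orthogonal Matching Pursuit, then reconstruction
    dico.set_params(transform_algorithm='omp', transform_n_nonzero_coefs=5)
    denoised = np.dot(dico.transform(noisy), dico.components_)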

Read more…

2015-05-05 Reproducing Olshausen's classical SparseNet (part 1)

  • This notebook tries to reproduce the learning strategy specified in the framework of the SparseNet algorithm from Bruno Olshausen. It allows one to efficiently code natural image patches by constraining the code to be sparse.

  • The underlying machinery uses the dictionary learning from the image denoising example in sklearn, and our aim here is to show that a novel ingredient is necessary to reproduce Olshausen's results.

  • All these code bits are gathered in the SHL scripts repository (where you will also find some older MATLAB code). You may install it using

    pip install git+https://github.com/bicv/SHL_scripts
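
Once installed, a hedged usage sketch, assuming the package exposes an SHL class with a learn_dico method (check the repository's README for the actual API):

    from shl_scripts import SHL

    shl = SHL()              # default learning parameters
    dico = shl.learn_dico()  # learn a dictionary on natural image patches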