2016-01-20 Using Scratch to illustrate the Flash-Lag Effect

Scratch (see https://scratch.mit.edu/) is a programming language aimed at bringing coding literacy to schools and education. Yet it can implement even complex algorithms and games. It is visual, multi-platform and, critically, open-source. The website also encourages sharing code: it is very easy to "fork" an existing project to change details or improve it. Openness at its best!

During a visit to the lab by a 14-year-old schoolboy, we used it to build a simple psychophysics experiment, available at https://scratch.mit.edu/projects/92044597/.
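Scratch projects are made of visual blocks and cannot be quoted here as text, but the stimulus itself is easy to sketch. As a rough, hypothetical Python/matplotlib transcription (the frame count, speed and dot positions are arbitrary choices of mine, not values taken from the project):

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    fig, ax = plt.subplots()
    ax.set_xlim(-1, 1); ax.set_ylim(-1, 1); ax.set_aspect('equal')
    moving, = ax.plot([], [], 'o', color='red')
    flash, = ax.plot([], [], 'o', color='green')

    n_frames = 100
    flash_frame = n_frames // 2   # flash when the moving dot crosses the midline

    def update(frame):
        x = -1 + 2 * frame / n_frames   # constant rightward motion
        moving.set_data([x], [0.2])
        # flash a second dot exactly below the moving one, for one frame only;
        # observers typically perceive the moving dot as being "ahead" of the flash
        if frame == flash_frame:
            flash.set_data([x], [-0.2])
        else:
            flash.set_data([], [])
        return moving, flash

    anim = FuncAnimation(fig, update, frames=n_frames, interval=20, blit=True)
    plt.show()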

Read more…

2016-01-19 élasticité grids V1

The installation Elasticité dynamique acts as a filter and generates new, multiplied spaces, like a quasi-infinite stacking of horizons. By the principle of reflection, the piece absorbs the image of its surroundings and accumulates points of view; its permanent motion continually redefines what is seen and heard.

We will now use elastic forces to coordinate the dynamics of the blades within the grid.
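The post does not spell out the implementation; as a hedged sketch of one way "elastic forces" could coordinate the blades, here is a chain of damped, spring-coupled angles in Python (all names and constants below are illustrative assumptions, not the installation's code):

    import numpy as np

    n_blades = 8 * 12          # hypothetical size of the grid of blades
    k, damping, dt = 5.0, 0.5, 0.01

    theta = np.random.uniform(-np.pi / 4, np.pi / 4, n_blades)  # blade angles
    omega = np.zeros(n_blades)                                  # angular velocities

    for t in range(1000):
        # elastic torque: each blade is pulled toward its two neighbours' angles
        # (np.roll makes the chain periodic: the last blade couples to the first)
        laplacian = np.roll(theta, 1) - 2 * theta + np.roll(theta, -1)
        omega += dt * (k * laplacian - damping * omega)
        theta += dt * omega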

Read more…

2016-01-18 bootstrapping posts for élasticité

The installation Elasticité dynamique acts as a filter and generates new, multiplied spaces, like a quasi-infinite stacking of horizons. By the principle of reflection, the piece absorbs the image of its surroundings and accumulates points of view; its permanent motion continually redefines what is seen and heard.

Video: http://vimeo.com/150813922

This meta-post manages publication to http://blog.invibe.net.

Read more…

2015-12-11 Reproducing Olshausen's classical SparseNet (part 3)

In this notebook, we test the convergence of SparseNet as a function of different learning parameters. This shows the relative robustness of the method with respect to the coding parameters, but also the importance of homeostasis for obtaining an efficient set of filters:

  • first, whatever the learning rate, convergence is not complete without homeostasis;
  • second, we achieve better convergence for similar learning rates, and over a certain range of learning rates for the homeostasis;
  • third, the smoothing parameter alpha_homeo has to be set properly to achieve good convergence (see the sketch after this list);
  • last, this homeostatic rule works with the different variants of sparse coding.
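As a hedged illustration of the kind of rule being tuned here (only alpha_homeo is named in the post; the update itself is my own schematic reading, not the notebook's code), a homeostatic step could smooth each atom's recent activity and derive a gain that rescales over-used atoms down and under-used atoms up:

    import numpy as np

    n_atoms, alpha_homeo = 324, 0.02   # illustrative values, not the notebook's
    mean_activity = np.ones(n_atoms)   # running estimate of per-atom activity

    def homeostatic_gain(sparse_code, mean_activity, alpha_homeo):
        """One homeostasis step from a batch of coefficients (n_samples, n_atoms)."""
        activity = np.mean(sparse_code ** 2, axis=0)
        # exponential moving average; alpha_homeo sets the smoothing time constant
        mean_activity = (1 - alpha_homeo) * mean_activity + alpha_homeo * activity
        # atoms that were recently over-active get a gain below 1, and vice versa,
        # nudging the sparse coder to pick under-used atoms more often
        gain = np.sqrt(mean_activity.mean() / (mean_activity + 1e-12))
        return gain, mean_activity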

See also:

Read more…

2015-12-11 Reproducing Olshausen's classical SparseNet (part 4)

In this notebook, we write an example script for the sklearn library showing the improvement in the convergence of dictionary learning induced by the introduction of Olshausen's homeostasis.
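sklearn's stock estimator does not include such a homeostatic rule, so the script below only sketches the vanilla dictionary-learning baseline the notebook starts from (the image, patch size and parameters are my own illustrative choices):

    import numpy as np
    from sklearn.datasets import load_sample_image
    from sklearn.feature_extraction.image import extract_patches_2d
    from sklearn.decomposition import MiniBatchDictionaryLearning

    # zero-mean patches from one of sklearn's sample images
    image = load_sample_image('china.jpg').mean(axis=2) / 255.0
    patches = extract_patches_2d(image, (12, 12), max_patches=5000, random_state=0)
    X = patches.reshape(len(patches), -1)
    X -= X.mean(axis=1, keepdims=True)

    dico = MiniBatchDictionaryLearning(n_components=100, alpha=1.0,
                                       batch_size=200, random_state=0)
    V = dico.fit(X).components_   # learned dictionary atoms, one per row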

See also:

Read more…