ENSTA Paris (École nationale supérieure de techniques avancées Paris)
Conference paper - Year: 2014

Latency-based probabilistic information processing in a learning feedback hierarchy

Abstract

In this article, we study a three-layer neural hierarchy composed of bi-directionally connected recurrent layers which is trained to perform a synthetic object recognition task. The main feature of this network is its ability to represent, transmit and fuse probabilistic information, and thus to take near-optimal decisions when inputs are contradictory, noisy or missing. This is achieved by a neural space-latency code which is a natural consequence of the simple recurrent dynamics in each layer. Furthermore, the network possesses a feedback mechanism that is compatible with the space-latency code by making use of the attractor properties of neural layers. We show that this feedback mechanism can resolve/correct ambiguities at lower levels. As the fusion of feedback information in each layer is achieved in a probabilistically coherent fashion, feedback only has an effect if low-level inputs are ambiguous.
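To make the abstract's two key ideas concrete, here is a minimal, purely illustrative sketch (not the paper's actual model): a latency code in which more probable hypotheses respond earlier, and a probabilistically coherent fusion of bottom-up and feedback evidence (here a normalized product, as in Bayesian cue combination) in which feedback only matters when the bottom-up input is ambiguous. All function names and dynamics are assumptions for illustration.

```python
import numpy as np

def latency_code(probs, t_max=100.0):
    """Map hypothesis probabilities to response latencies:
    higher probability -> shorter latency (illustrative)."""
    probs = np.asarray(probs, dtype=float)
    probs = probs / probs.sum()       # normalize to a distribution
    return t_max * (1.0 - probs)      # p=1 responds at t=0, p=0 at t_max

def fuse(p_bottom_up, p_feedback):
    """Coherent fusion of bottom-up and feedback distributions via a
    normalized product. Sharp bottom-up evidence is barely changed by
    feedback; ambiguous (near-uniform) evidence is resolved by it."""
    fused = np.asarray(p_bottom_up, dtype=float) * np.asarray(p_feedback, dtype=float)
    return fused / fused.sum()

# Ambiguous bottom-up input: feedback resolves the decision.
print(fuse([0.5, 0.5], [0.9, 0.1]))    # -> [0.9, 0.1]
# Sharp bottom-up input: the same feedback has almost no effect.
print(fuse([0.99, 0.01], [0.9, 0.1]))
# More probable hypothesis gets the shorter latency.
print(latency_code([0.8, 0.2]))        # -> [20.0, 80.0]
```

This captures only the qualitative behavior claimed in the abstract (feedback acts only on ambiguous low-level inputs); the paper's networks realize it through recurrent attractor dynamics rather than an explicit product rule.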
Main file: gepperth2014latency.pdf (438.25 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01098704 , version 1 (28-12-2014)

Identifiers

Cite

Alexander Gepperth. Latency-based probabilistic information processing in a learning feedback hierarchy. International Joint Conference on Neural Networks (IJCNN), Jun 2014, Beijing, China. pp.3031 - 3037, ⟨10.1109/IJCNN.2014.6889919⟩. ⟨hal-01098704⟩
56 views
107 downloads
