Date: Feb 7, 2020
Speaker: Prof. Jun Tani (OIST)
Title: A Proposal of a Novel Variational Bayes Recurrent Neural Network Model Under Predictive Coding and Active Inference Frameworks
Venue: Araya Inc, Arc Mori Building 24F
This talk introduces a novel variational Bayes RNN model that accounts for possible neuronal information-processing mechanisms assumed in the predictive coding and active inference frameworks. The model minimizes the free energy, expressed as the weighted sum of two terms: the reconstruction error and the complexity. We examined how this weighting affects the development of the network's internal information processing as it learns noisy, fluctuating temporal patterns. The simulation results show that weak weighting of the complexity term in minimizing the free energy leads to the development of deterministic chaos with a strong top-down prior that imitates the randomness observed in the target sequence patterns, whereas strong weighting leads to the development of stochastic dynamics with a weak top-down prior that imitates the probabilistic processes observed in the targets. Moreover, the results indicate that the best generalization in learning emerges between these two extremes. The talk concludes with implications for the neuronal mechanisms underlying the sense of agency and autism spectrum disorder, drawn from analyses of recent robotics experiments using this model.
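The weighted free-energy objective described above can be illustrated with a minimal sketch. The function below assumes diagonal-Gaussian posterior and prior distributions and a squared-error reconstruction term; the weighting factor `w` on the complexity (KL) term plays the role of the weighting discussed in the talk. All variable names are illustrative, not taken from the referenced implementation.

```python
import numpy as np

def free_energy(x, x_recon, mu_q, logvar_q, mu_p, logvar_p, w):
    """Weighted free energy: reconstruction error + w * KL(q || p).

    q and p are diagonal Gaussians given by their means and
    log-variances. A small w favors accurate reconstruction
    (deterministic, chaos-like dynamics); a large w favors
    closeness of the posterior to the prior (stochastic dynamics).
    """
    # Reconstruction error (sum of squared errors)
    recon = np.sum((x - x_recon) ** 2)
    # Closed-form KL divergence between two diagonal Gaussians
    kl = 0.5 * np.sum(
        logvar_p - logvar_q
        + (np.exp(logvar_q) + (mu_q - mu_p) ** 2) / np.exp(logvar_p)
        - 1.0
    )
    return recon + w * kl
```

When the posterior equals the prior, the KL term vanishes and the objective reduces to the reconstruction error alone, independent of `w`.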
Ref. Ahmadi, A., & Tani, J. (2019). A novel predictive-coding-inspired variational RNN model for online prediction and recognition. Neural Computation, 31, 2025–2074.
YouTube URL: https://youtu.be/vnwAa2_PGqQ