Learning Successor Features with Distributed Hebbian Temporal Memory

Authors

Dzhivelikian E., Kuderov P., Pospelov N., Panov A.

Annotation

This paper presents a novel approach to the challenge of online temporal memory learning for decision-making under uncertainty in non-stationary, partially observable environments. The proposed algorithm, Distributed Hebbian Temporal Memory (DHTM), is based on the factor graph formalism and a multicomponent neuron model. DHTM aims to capture sequential relationships in data and to make cumulative predictions about future observations, forming Successor Features (SFs). Inspired by neurophysiological models of the neocortex, the algorithm uses distributed representations, sparse transition matrices, and local Hebbian-like learning rules to overcome the instability and slow learning of traditional temporal memory algorithms such as RNNs and HMMs. Experimental results demonstrate that DHTM outperforms LSTM and a biologically inspired HMM-like algorithm, CSCG, on non-stationary datasets. Our findings suggest that DHTM is a promising approach to online sequence learning and planning in dynamic environments.
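For context, the standard definition of Successor Features from the reinforcement learning literature (a point of reference only; the paper's own notation may differ) encodes the expected discounted sum of future observation features under a policy $\pi$:

\[
\psi^{\pi}(s) \;=\; \mathbb{E}_{\pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t}\, \phi(o_{t}) \,\middle|\, s_{0} = s\right],
\]

so that for any reward linear in the features, $r_t = \phi(o_t)^{\top} w$, the value function factors as $V^{\pi}(s) = \psi^{\pi}(s)^{\top} w$. In this sense, DHTM's cumulative predictions about future observations correspond to $\psi$.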

External links

Download the collection of conference abstracts from the conference website (PDF, in Russian): https://disk.yandex.ru/i/MLASNwNNYSiGWQ

Download an earlier version of the article from arXiv (PDF, in English): https://arxiv.org/abs/2310.13391

Reference link

Dzhivelikian, E., Kuderov, P., Pospelov, N., Panov, A. Learning Successor Features with Distributed Hebbian Temporal Memory // X International Conference on Cognitive Science: Abstracts. Pyatigorsk, June 26–30, 2024. In two parts. Part II, pp. 106–107.