Learning Successor Features with Distributed Hebbian Temporal Memory

Authors

Dzhivelikian E., Kuderov P., Panov A.

Annotation

This paper presents a novel approach to address the challenge of online sequence learning for decision making under uncertainty in non-stationary, partially observable environments. The proposed algorithm, Distributed Hebbian Temporal Memory (DHTM), is based on the factor graph formalism and a multi-component neuron model. DHTM aims to capture sequential data relationships and make cumulative predictions about future observations, forming Successor Features (SFs). Inspired by neurophysiological models of the neocortex, the algorithm uses distributed representations, sparse transition matrices, and local Hebbian-like learning rules to overcome the instability and slow learning of traditional temporal memory algorithms such as RNN and HMM. Experimental results show that DHTM outperforms LSTM, RWKV and a biologically inspired HMM-like algorithm, CSCG, on non-stationary data sets. Our results suggest that DHTM is a promising approach to address the challenges of online sequence learning and planning in dynamic environments.
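The central quantity mentioned above, the Successor Feature (SF), is the expected discounted sum of future observation features. The paper learns SFs with DHTM's factor-graph machinery and local Hebbian-like rules; the sketch below is only a minimal illustration of the SF definition using a generic tabular temporal-difference update, and all names (n_states, phi, psi, gamma, lr) and the random-walk dynamics are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

n_states, n_features = 5, 3
gamma, lr = 0.9, 0.1

# phi[s]: observation features of state s (random here, purely for illustration)
phi = rng.random((n_states, n_features))

# psi[s]: successor features of state s, i.e. the expected discounted
# sum of future phi along trajectories that start in s
psi = np.zeros((n_states, n_features))

def sf_td_update(s, s_next):
    """One local update toward the SF fixed point psi(s) = phi(s) + gamma * E[psi(s')]."""
    target = phi[s] + gamma * psi[s_next]
    psi[s] += lr * (target - psi[s])

# Example: learn SFs from a stream of transitions (a random walk over states)
s = 0
for _ in range(10_000):
    s_next = rng.integers(n_states)
    sf_td_update(s, s_next)
    s = s_next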

External links

DOI: 10.48550/arXiv.2310.13391

Download the article (PDF) from arXiv.org: https://arxiv.org/abs/2310.13391

Download the article (PDF) from the conference archive at OpenReview: https://openreview.net/forum?id=wYJII5BRYU

ResearchGate: https://www.researchgate.net/publication/394447251_Learning_Successor_Features_with_Distributed_Hebbian_Temporal_Memory

Reference link

Evgenii Dzhivelikian, Petr Kuderov, Aleksandr I. Panov. Learning Successor Features with Distributed Hebbian Temporal Memory // The Thirteenth International Conference on Learning Representations (ICLR 2025), Singapore EXPO, April 24–28, 2025.