Learning Successor Representations with Distributed Hebbian Temporal Memory


Panov A. I., Kuderov P. V.


This paper presents a novel approach to the challenge of online hidden representation learning for decision-making under uncertainty in non-stationary, partially observable environments. The proposed algorithm, Distributed Hebbian Temporal Memory (DHTM), is based on the factor graph formalism and a multi-component neuron model. DHTM aims to capture sequential data relationships and make cumulative predictions about future observations, forming a Successor Representation (SR). Inspired by neurophysiological models of the neocortex, the algorithm uses distributed representations, sparse transition matrices, and local Hebbian-like learning rules to overcome the instability and slow learning of traditional temporal memory algorithms such as RNNs and HMMs. Experimental results demonstrate that DHTM outperforms a classical LSTM, performs comparably to more advanced RNN-like algorithms, and speeds up Temporal Difference learning of the SR in changing environments. Additionally, we compare the SRs produced by DHTM to those of another biologically inspired HMM-like algorithm, CSCG. Our findings suggest that DHTM is a promising approach to online hidden representation learning in dynamic environments.
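For context on the learning target mentioned in the abstract: the Successor Representation M stores, for each state, the expected discounted future occupancy of every other state, and it can be learned online with Temporal Difference updates. The sketch below is not the paper's DHTM algorithm; it is a minimal tabular illustration of the standard TD rule for the SR (states, step size, and the toy 3-state chain are illustrative assumptions).

```python
import numpy as np

def td_update_sr(M, s, s_next, alpha=0.1, gamma=0.9):
    """One temporal-difference update of the successor representation.

    M[s, j] estimates the expected discounted number of future visits
    to state j when starting from state s.
    """
    onehot = np.zeros(M.shape[1])
    onehot[s] = 1.0
    # TD error: observed one-step occupancy plus bootstrapped future SR,
    # minus the current estimate.
    td_error = onehot + gamma * M[s_next] - M[s]
    M[s] += alpha * td_error
    return M

# Toy example: deterministic 3-state cycle 0 -> 1 -> 2 -> 0 -> ...
n_states = 3
M = np.zeros((n_states, n_states))
for _ in range(2000):
    for s in range(n_states):
        td_update_sr(M, s, (s + 1) % n_states)
```

For a fixed policy with transition matrix P, these updates converge to the closed-form SR, M = (I - gamma * P)^(-1), which is what TD learning of the SR approximates incrementally from experience.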

External links

DOI: 10.48550/arXiv.2310.13391

Download the PDF from arXiv.org (in English): https://arxiv.org/abs/2310.13391

How to cite

Evgenii Dzhivelikian, Petr Kuderov, Aleksandr I. Panov. Learning Successor Representations with Distributed Hebbian Temporal Memory // arXiv preprint arXiv:2310.13391, 2023.