A simple model of sequence prediction based on dendritic spatiotemporal integration


Bibliographic Details
Date: 2018
Main Author: Osaulenko, Viacheslav M.
Format: Article
Language: English
Published: The National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", 2018
Online Access: http://journal.iasa.kpi.ua/article/view/143211
Journal Title:System research and information technologies

Description
Summary: Recent experiments on dendritic spatiotemporal integration reveal a much greater computational potential of a single neuron than previously assumed. An individual dendritic branch can act as a coincidence detector, because a dendritic spike is initiated by synapses activated close together in space and time. Here, we investigate the proposed idea that dendrites can perform temporal integration on a behavioral timescale (~1 s), thus relaxing the simultaneous-activation constraint. We construct a model of a recurrent neural network in which each neuron is activated not by a weighted summation of its inputs but by their coincident activation in both space and time. We show that, by using a sparse distributed representation and tracking the network's activity within a certain time window, it is possible to achieve a high-capacity prediction system. We perform a theoretical analysis and estimate the capacity for different parameters of the model; even a network with 100 neurons can store millions of sequences. Such a capacity implies a biologically unrealistic number of synapses, far more than 100×100. Nevertheless, this mechanism of tracking space-time coincidences in sparse activation can be realized in a limited biological neural network while still providing good sequence-transition memory.
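The mechanism the summary describes can be illustrated with a minimal sketch. This is not the author's exact model: the parameter values (N, K, W, THETA), the random binary synapse matrix, and the k-winners-take-all step are illustrative assumptions. It shows the two key ingredients named in the abstract: a sparse distributed representation, and neurons that fire on coincident input within a time window rather than on a weighted sum.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)

N = 100      # neurons in the network (as in the capacity estimate above)
K = 5        # active neurons per step -- sparse distributed representation
W = 3        # time window (steps) over which coincidences are tracked
THETA = 4    # coincident active synapses needed to trigger a dendritic "spike"

# Illustrative random binary connectivity:
# synapses[i, j] = 1 if neuron j synapses onto neuron i.
synapses = (rng.random((N, N)) < 0.2).astype(int)

def step(history):
    """Activate neurons whose synapses saw >= THETA coincident inputs
    anywhere within the recent time window (space-time coincidence)."""
    # Taking the union of activity over the window is what relaxes
    # the strict simultaneous-activation constraint.
    window = np.clip(np.sum(history[-W:], axis=0), 0, 1)
    drive = synapses @ window  # coincident active inputs per neuron
    active = np.zeros(N, dtype=int)
    # k-winners-take-all keeps the next representation sparse as well.
    winners = np.argsort(drive)[-K:]
    active[winners[drive[winners] >= THETA]] = 1
    return active

# Drive the network with a few random sparse patterns, then predict.
history = [np.zeros(N, dtype=int) for _ in range(W)]
for t in range(5):
    x = np.zeros(N, dtype=int)
    x[rng.choice(N, K, replace=False)] = 1
    history.append(x)

prediction = step(history)
print(prediction.sum())  # at most K neurons are predicted active

# One way to see why "millions of sequences" is plausible for N = 100:
# the number of distinct K-of-N sparse codes is C(100, 5).
print(comb(N, K))  # 75287520
```

The combinatorial count at the end is only a loose upper bound on the code space, not the paper's capacity formula, which also depends on the connectivity and the time-window parameters.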