Application of the volume learning algorithm artificial neural networks for recognition of the type of interaction between neurons from their cross-correlation histograms
An algorithm based on two types of artificial neural networks (ANNs) is proposed. The first network is an associative ANN, while the second is a Self-Organizing Map of Kohonen. The results for a test set are similar to the performance of our previous expert system algorithm developed with the Group Method of Data Handling (GMDH). However, while the GMDH uses indices derived from expert knowledge (and thus requires considerable time and resources), the VLA processes the initial raw data.
| Date: | 2005 |
|---|---|
| Main Authors: | , |
| Format: | Article |
| Language: | English |
| Published: |
Навчально-науковий комплекс "Інститут прикладного системного аналізу" НТУУ "КПІ" МОН та НАН України
2005
|
| Subjects: | |
| Online Access: | https://nasplib.isofts.kiev.ua/handle/123456789/14089 |
| Journal Title: | Digital Library of Periodicals of National Academy of Sciences of Ukraine |
| Cite this: | Application of the volume learning algorithm artificial neural networks for recognition of the type of interaction between neurons from their cross-correlation histograms / V.V. Kovalishyn, I.V. Tetko // Систем. дослідж. та інформ. технології. — 2005. — № 3. — С. 48-56. — Бібліогр.: 20 назв. — англ. |
| description | An algorithm based on two types artificial neural networks (ANNs) is proposed. The first network is an associative ANN while the second network is a Self-Organizing Map of Kohonen. The results for a test set are similar to the performance of our pre-vious expert system algorithm developed with Group Method of Data Handling (GMDH). However, while GMDH uses indices derived using the expert knowledge (and thus require considerable time and resources) the VLA process initial raw data.
Для решения задачи распознавания типов взаимодействия между нейронами предложен алгоритм, основанный на использовании двух типов искусственных нейронных сетей (ИНС). Первая сеть представляет собой ассоциативную ИНС, тогда как вторая — самоорганизующиеся карты Кохонена. Результаты, полученные для тестового набора данных, подобны результатам, найденным методом группового учета аргументов (МГУА). Однако новый подход использует только исходные данные, тогда как МГУА — производные индексов, полученные дополнительным анализом начальных индексов.
Для вирішення задачі розпізнавання типів взаємодії між нейронами запропоновано алгоритм, заснований на використанні двох типів штучних нейронних мереж (ШНМ). Перша мережа представляє собою асоціативну ШНМ, тоді як друга — карту Кохонена, що самоорганізується. Результати тестування на наборі даних подібні до результатів, отриманих методом групового врахування аргументів (МГВА). Однак новий підхід використовує тільки початкові дані, тоді як МГВА — похідні індексів, отримані додатковим аналізом початкових індексів.
|
V.V. Kovalishyn, I.V. Tetko, 2005
48 ISSN 1681–6048 System Research & Information Technologies, 2005 № 3
УДК 519.688
APPLICATION OF THE VOLUME LEARNING ALGORITHM
ARTIFICIAL NEURAL NETWORKS FOR RECOGNITION OF
THE TYPE OF INTERACTION BETWEEN NEURONS FROM
THEIR CROSS-CORRELATION HISTOGRAMS
V.V. KOVALISHYN, I.V. TETKO
An algorithm based on two types of artificial neural networks (ANNs) is proposed. The first network is an associative ANN, while the second is a Self-Organizing Map of Kohonen. The results for a test set are similar to the performance of our previous expert system algorithm developed with the Group Method of Data Handling (GMDH). However, while the GMDH uses indices derived from expert knowledge (and thus requires considerable time and resources), the VLA processes the initial raw data.
INTRODUCTION
The problem of recognizing the type of neuron interaction is of great practical importance in neurophysiology. Many investigations show a close relation between various types of neurological diseases and functional disorders in neuron interaction in comparison with a reference pattern. For example, a study of a Parkinson's disease model in macaques showed a high oscillatory activity and a high degree of correlation of the neurons in the globus pallidus [1]. At the same time, the cross-correlograms of healthy monkeys usually showed the absence of any interaction between neurons. Note that the type of interaction between particular neurons is determined not only by their physiology (e.g., by synaptic relations between neurons) but also by the functional condition of the entire brain. For this reason, it may change depending on the state of the animal (such as sleep and wakefulness, or the accomplishment of a certain task [2, 3]); sometimes, it may change in a fraction of a second [4]. Examination of the interaction between neurons helps us to better understand the functioning of the brain and to develop new methods of treatment of nervous system diseases. Analysis of the cross-correlation histogram is therefore one of the most widely applied methods for classifying the functional types of neuron interaction. Other, more complex methods for estimating functional relations in pairs [5], triples [6], and arbitrary sets [7] of neurons have also been developed. However, the method based on analyzing cross-correlograms remains one of the most frequently used worldwide because the interpretation of its results is the most evident and simple.
The cross-correlation histogram is the empirical distribution of the time delays of one neuron's impulses relative to the impulses of another neuron, plotted over the time range of 0 to 500 ms at a 1 ms interval. The cross-correlograms are plotted for pairs of neurons and used to classify several basic types of neuron interaction according to the shape and position of the histogram peak [8]. The basic interaction types include (a) the absence of interaction between neurons; (b) the presence of a common input; and (c) the presence of direct activating/inhibiting connections between neurons. The strength and duration of a neuron interaction can be estimated from the shape of the peak.
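To make the construction concrete, here is a minimal sketch (not the authors' code) of how such a histogram can be computed from two spike trains with NumPy; the function name and the toy spike times below are illustrative only.

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, t_max=500.0, bin_ms=1.0):
    """Histogram of the delays of neuron B's impulses relative to each impulse
    of neuron A, binned at 1 ms over [0, t_max) ms, as described in the text.
    Spike times are assumed to be in milliseconds."""
    edges = np.arange(0.0, t_max + bin_ms, bin_ms)
    delays = []
    for t in spikes_a:
        d = np.asarray(spikes_b, dtype=float) - t
        delays.extend(d[(d >= 0.0) & (d < t_max)])
    counts, _ = np.histogram(delays, bins=edges)
    return counts  # 500 bins for the default settings

# toy example: neuron B fires about 5 ms after each spike of neuron A
a = [10.0, 100.0, 250.0]
b = [15.0, 105.0, 255.0]
cch = cross_correlogram(a, b)
```

With these toy trains, a clear peak appears in the 5 ms bin, the kind of feature the classification relies on.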
In order to recognize the type of neuron interaction, in our previous work we used various parameters that characterize the shape of cross-correlograms [9]. The choice of these parameters was made in collaboration with the experts who analyzed the data; thus the experience of the experts was a priori incorporated into the classification system in the form of these parameters. An interesting question was whether a classification system of similar performance could be constructed without using any a priori expert knowledge.
Recently, a Volume Learning Algorithm (VLA) was proposed to study quantitative structure–activity relationships (QSAR) in medicinal chemistry [10]. This algorithm was successfully applied to correlate tens of thousands of input molecular parameters, representing electrostatic and steric interactions of molecules, with the biological activities of series of cannabimimetic aminoalkylindoles, N-benzylpiperidine analogs, etc. [10, 11, 12]. The VLA is a combination of supervised and unsupervised neural networks. The algorithm defines clusters in the input parameter space using the Self-Organizing Map of Kohonen (SOM) [14] and then uses the mean values of these clusters for the training of an ensemble of feed-forward back-propagation neural networks. This approach decreases the number of input parameters required for neural network training and yields neural network models with high generalization ability.
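The two-stage idea can be illustrated with the following sketch under simplifying assumptions (a tiny one-dimensional Kohonen map, NumPy only; all names are ours, and this is not the published implementation): the SOM clusters the input parameters, and the per-sample means of each cluster become the reduced inputs for a supervised network.

```python
import numpy as np

def som_1d(vectors, n_units=4, epochs=50, lr0=0.5, seed=0):
    """Tiny 1-D Kohonen map (a sketch, not the authors' implementation).
    `vectors` has one row per item to be clustered; returns cluster labels."""
    rng = np.random.default_rng(seed)
    w = vectors[rng.choice(len(vectors), n_units, replace=False)].astype(float)
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)                       # decaying learning rate
        radius = max(1.0, n_units / 2 * (1.0 - e / epochs)) # shrinking neighborhood
        for v in vectors[rng.permutation(len(vectors))]:
            bmu = np.argmin(((w - v) ** 2).sum(axis=1))     # best-matching unit
            dist = np.abs(np.arange(n_units) - bmu)
            h = np.exp(-(dist ** 2) / (2 * radius ** 2))[:, None]
            w += lr * h * (v - w)                           # pull neighborhood toward v
    return np.array([np.argmin(((w - v) ** 2).sum(axis=1)) for v in vectors])

def reduce_inputs(X, n_units=4):
    """Cluster the columns (input parameters) of X and return, per sample, the
    mean value of each cluster -- the reduced inputs a supervised network would
    be trained on in a VLA-style pipeline."""
    labels = som_1d(X.T, n_units=n_units)
    used = np.unique(labels)
    return np.column_stack([X[:, labels == c].mean(axis=1) for c in used]), labels

rng = np.random.default_rng(1)
sig1, sig2 = rng.normal(size=20), rng.normal(size=20)
X = np.column_stack([sig1] * 5 + [sig2] * 5)  # 20 samples, 10 redundant parameters
reduced, labels = reduce_inputs(X, n_units=4)
```

Identical parameters fall into the same cluster, so the supervised stage sees at most `n_units` inputs instead of the original ten.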
In the current study we extend the application of this algorithm to the classification of types of neuron interaction and demonstrate that its performance is comparable with that of the expert system developed in our previous studies.
DATA
The data used in this study were previously described in [13]. Each interaction type was represented as a three-symbol code. The first symbol of the code described the kind of neuronal interaction, while the second and third symbols denoted the strength and duration of the interaction, respectively.
The first symbol of the code of a histogram was a letter; depending on the
kind of the interaction, it could be:
C — a common input,
E — an exciting input,
I — inhibition (suppression),
L — a «large» input, or
Z — a non-typical code (i.e., the classification of the cross-correlogram was not performed).
The second symbol of the code was a number showing the interaction strength, measured on a four-grade scale according to the ratio of the maximum histogram value to the first and second confidence levels. This number could be
0 — the interaction was present, but the maximum histogram value was
below the first confidence level («a very weakly manifested type»),
1 — the maximum histogram value was at the first confidence level («a
weakly manifested type»),
2 — the maximum histogram value was at the second confidence level («a
moderate type»), or
3 — the maximum histogram value was above the second confidence level
(«a distinct type»).
The third symbol of the code was a letter showing the duration of the
interaction:
A — T = 1 ms (a rare type);
B — T = 2 ms (a rare type);
C — 2 < T ≤ 20 ms;
D — 20 < T ≤ 50 ms;
E — 50 < T ≤ 150 ms;
F — 150 < T ≤ 200 ms;
G — 200 < T ≤ 400 ms (only for the L type);
H — T > 400 ms (only for the L type).
If there was no interaction between neurons (this was the most frequent type), the code was assigned the value 0.
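As a worked illustration of this coding scheme, a small helper can assemble the three-symbol code from the kind, strength grade and duration; the exact handling of the range boundaries is our assumption, since the text gives the ranges only approximately.

```python
def duration_letter(t_ms, kind):
    """Map interaction duration (ms) to the third code symbol described in the
    text. Boundary conventions are an assumption of this sketch."""
    if t_ms <= 1: return "A"
    if t_ms <= 2: return "B"
    if t_ms <= 20: return "C"
    if t_ms <= 50: return "D"
    if t_ms <= 150: return "E"
    if t_ms <= 200: return "F"
    if kind == "L" and t_ms <= 400: return "G"  # G and H apply to L only
    if kind == "L": return "H"
    raise ValueError("duration out of range for kind %r" % kind)

def histogram_code(kind, strength, t_ms):
    """Three-symbol code: kind letter, strength grade 0-3, duration letter.
    The single symbol '0' codes the absence of interaction."""
    if kind == "0":
        return "0"
    return f"{kind}{strength}{duration_letter(t_ms, kind)}"
```

For example, a common input of strength 1 lasting 10 ms is coded `C1C`, matching the type names used in the tables below.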
The initial training data set included 3444 histograms recorded in the auditory cortex of wild-type and mutant mice. There was also another set of 2666 histograms recorded in a different set of experiments; this set was used to further test the performance of the developed method.
METHOD
The VLA method combines unsupervised and supervised neural network methods. Here we give only a brief description of this algorithm; more details can be found elsewhere [10, 11].
Artificial neural networks can be subdivided into two main categories. The first category, unsupervised neural networks, such as the Kohonen neural network [14], are trained without a teacher [14, 15]. This means that the target values are considered unknown or absent, and neural network learning consists of the detection and clustering of input samples according to some internal relationships among them. However, in practice the user is explicitly or implicitly interested in some particular clustering that is relevant to some target activity, i.e. the clustering is always «target»-based. Such clustering is usually achieved by pre-selecting input parameters that are considered relevant to this target activity. Basically, this pre-selection corresponds to introducing a weighting scheme over the input parameters, e.g. some parameters receive unit weight and are selected for clustering while others receive zero weight and are not selected. Thus, pre-selection introduces some metric in the space of input parameters, and the performance of an unsupervised method depends to a great degree on the correct choice of this metric.
The VLA uses the supervised algorithm to explicitly determine the relevant clustering metric and the importance of input parameters, and thereby to improve the clustering of the unsupervised method, as shown below.
Supervised neural networks are used to calculate dependencies between input and output variables. One of the most well-known networks belonging to this second class is the feed-forward neural network (FFNN) trained with the back-propagation algorithm [16, 17]. The application of FFNNs to a data set with a large number of input parameters, e.g. the data points of cross-correlograms, is complicated. Firstly, the speed of a neural network is low when dealing with a large number of input parameters. Secondly, an FFNN can have low generalization ability due to the overfitting/overtraining problem, which becomes more critical as the number of inputs increases. The presence of correlation effects between input variables could further impair the FFNN's generalization. The algorithm could provide better performance if one were to cluster the input parameters and provide a limited number of inputs for the neural network training. Of course, such clustering should be performed using a similarity measure corresponding to the target activity of the supervised approach. The question is how to determine the measure of similarity for such clustering. The VLA clusters the input parameters that have similar input-to-hidden layer weights of the neural networks following their training.
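The weight-based similarity measure can be sketched as follows (an illustration of the idea, not the published code; the function names are ours): each input parameter gets a "signature" built from its input-to-hidden weights across the trained ensemble, and strongly correlated signatures indicate parameters that should fall into the same cluster.

```python
import numpy as np

def weight_signatures(ensemble_W):
    """Stack each input parameter's input-to-hidden weight vectors across an
    ensemble of trained networks into one signature per parameter.
    `ensemble_W` is a list of (n_hidden, n_inputs) weight matrices."""
    return np.vstack(ensemble_W).T  # shape: (n_inputs, total hidden units)

def similar_inputs(sig, i, j):
    """Correlation of the weight signatures of inputs i and j; values near 1
    mark candidates for the same cluster."""
    return np.corrcoef(sig[i], sig[j])[0, 1]

# toy ensemble of two 2-hidden-unit networks over 3 inputs,
# where inputs 0 and 1 received identical weights
W1 = np.array([[1.0, 1.0, 0.0], [2.0, 2.0, 5.0]])
W2 = np.array([[0.0, 0.0, 3.0], [1.0, 1.0, 1.0]])
sig = weight_signatures([W1, W2])
```

Inputs 0 and 1 have identical signatures and thus a correlation of 1, so a clustering step driven by this measure would merge them.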
Thus, in the VLA the supervised method is used to determine a metric for the unsupervised method (this improves «target»-based clustering), while the unsupervised method is used to decrease the number of inputs for the supervised algorithm (Fig. 1).
The clustering decreases the dimension of the input parameter space, increases the signal-to-noise ratio and improves the performance of supervised algorithms. Thus both supervised and unsupervised algorithms «collaborate», help one another and mutually profit from such «collaboration». In practice the work of the algorithm consists of several iterations that consistently improve the quality of clustering and of the supervised learning of both algorithms (Fig. 1).
On the first iteration of the algorithm, in the absence of supervised learning results, the clustering of inputs is performed using initial values. The unsupervised learning was done using Kohonen neural networks. The SOM of Kohonen
[Fig. 1. Block scheme of the volume learning algorithm. Blocks: data set; initialization of parameters; Kohonen net; partition of data into clusters; additional clusterisation (with a new map size) if needed; training and testing of models by ASNNs; table of weights; selection of the optimal model; prediction of the test sets.]
is a «self-organizing» system capable of solving unsupervised problems. The SOM represents a lattice of neurons, with the dimension of each neuron (the weights of a SOM neuron) corresponding to the dimension of the input cases. Starting from a random initialization of the neurons, the SOM automatically adapts itself in such a way that similar input objects are associated with topologically close neurons in the map, i.e. neurons physically located close to each other on the map.
The supervised learning was performed using our implementation of the FFNN, the so-called Associative Neural Network (ASNN) [18]. This type of network improves the prediction ability of the FFNN by explicit correction of the bias of this method. The architecture of the ASNN consisted of three layers with five neurons in the hidden layer. The number of output neurons corresponded to the number of classes of neuronal interactions. For each interaction all output values were zero except the one that indicated the class of the sample. A bias neuron was present on the input and on the hidden layer. M = 200 independent ASNNs were trained and used to cluster the input parameters.
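The target coding and layer sizes just described can be sketched as follows (a reconstruction from the text, not the original code; the figure of 22 inputs below is only an example, standing in for the number of cluster means fed to the network).

```python
import numpy as np

N_CLASSES = 12  # the 12 interaction types of Table 2
N_HIDDEN = 5    # hidden-layer size reported in the text

def one_hot(class_index, n_classes=N_CLASSES):
    """Target coding used for training: all outputs zero except the one
    marking the sample's class."""
    t = np.zeros(n_classes)
    t[class_index] = 1.0
    return t

def layer_shapes(n_inputs):
    """Weight-matrix shapes of the three-layer network, with the bias neuron
    added on the input and on the hidden layer."""
    return [(N_HIDDEN, n_inputs + 1), (N_CLASSES, N_HIDDEN + 1)]
```

For 22 inputs this gives weight matrices of shape (5, 23) and (12, 6), i.e. the extra column in each matrix carries the bias.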
RESULTS AND DISCUSSION
The statistics of the type frequencies are presented in Table 1. Since the types C0C, C0D and C0E are very rare, the histograms of these types were united into one conventional type, C0. The conventional type E was formed similarly. Table 2 shows the formation and enumeration of the M_type = 12 types used to solve the recognition problem.
T a b l e 1 . The frequency of histogram types for N = 3444 histograms

| Kind of interaction | Interaction strength | C (2–20 ms) | D (20–50 ms) | E (50–150 ms) |
|---|---|---|---|---|
| Common input (C), 1588 | 0 | 4 | 10 | 20 |
| | 1 | 212 | 128 | 206 |
| | 2 | 223 | 138 | 302 |
| | 3 | 97 | 61 | 159 |
| Exciting input (E), 56 | 0 | – | – | – |
| | 1 | 10 | 20 | 1 |
| | 2 | 4 | 13 | – |
| | 3 | – | 1 | – |
| «Large» input (L), 8 | 0 | – | – | – |
| | 1 | – | – | 5 |
| | 2 | – | – | 3 |
| | 3 | – | – | – |

The «no interaction» type — 1774 histograms.
The initial data set of 3444 histograms was used to train the VLA, while the second set of 2666 histograms was used to test the obtained model.
The input and target values were scaled between 0.1 and 0.9 for network training. Since all parameters of the data were mutually dependent, the data were normalized by the maximal and minimal values found over all samples of the data.
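A minimal sketch of this scaling, assuming (as the text states) a single global minimum and maximum taken over the whole data set:

```python
import numpy as np

def scale_01_09(X, lo=None, hi=None):
    """Linear scaling of the data into [0.1, 0.9] using the global minimum and
    maximum over all samples. The single global min/max (rather than per-column
    values) reflects the statement that all parameters were normalized together."""
    lo = np.min(X) if lo is None else lo
    hi = np.max(X) if hi is None else hi
    return 0.1 + 0.8 * (X - lo) / (hi - lo)

X = np.array([[0.0, 10.0], [5.0, 2.0]])
S = scale_01_09(X)
```

Keeping values away from 0 and 1 is a standard precaution with sigmoid output units, which saturate at the extremes.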
T a b l e 2 . The neighboring table of the 12 histogram types

| | C | D | E |
|---|---|---|---|
| C0 | Type 0 — C0 (48) | | |
| C1 | Type 1 — C1C (169) | Type 2 — C1D (269) | Type 3 — C1E (109) |
| C2 | Type 4 — C2C (127) | Type 5 — C2D (395) | Type 6 — C2E (141) |
| C3 | Type 7 — C3C (22) | Type 8 — C3D (188) | Type 9 — C3E (107) |
| E | Type 11 — E** (52) | | |
| No interaction | Type 10 (1774) | | |
The n = 500 input parameters were divided by the VLA into 22 clusters. Most of the clusters were relatively small (cluster sizes varied from 4 to 50 parameters), whereas one cluster consisted of 115 parameters.
In order to evaluate the stability of the neural network recognition, we used a scheme with decision rejection for various threshold values P0 (see Table 3), proposed in [9].
T a b l e 3 . The recognition LOO results for the training set according to the scheme with rejection of making a decision, for various threshold values P0

| Num. | P0 | S_rej | S_100 | S_50 | S_00 | N − S_rej | M (%) | S (%) | O_rej (%) |
|---|---|---|---|---|---|---|---|---|---|
| GMDH | | | | | | | | | |
| 1 | 0.00 | 0 | 2535 | 489 | 420 | 3444 | 81 | 74 | 0.00 |
| 2 | 0.25 | 167 | 2472 | 438 | 336 | 3277 | 82 | 75 | 4.85 |
| 3 | 0.45 | 354 | 2391 | 376 | 306 | 3090 | 83 | 77 | 10.28 |
| 4 | 0.50 | 499 | 2322 | 349 | 259 | 2945 | 85 | 79 | 14.49 |
| 5 | 0.55 | 574 | 2272 | 329 | 253 | 2870 | 85 | 79 | 16.67 |
| 6 | 0.75 | 841 | 2117 | 255 | 218 | 2603 | 86 | 81 | 23.95 |
| 7 | 1.00 | 1129 | 1960 | 165 | 180 | 2315 | 88 | 85 | 32.78 |
| VLA | | | | | | | | | |
| 1 | 0.00 | 0 | 2263 | 598 | 583 | 3444 | 74 | 66 | 0.00 |
| 2 | 0.46 | 167 | 2213 | 552 | 512 | 3277 | 76 | 68 | 4.84 |
| 3 | 0.57 | 355 | 2154 | 498 | 437 | 3089 | 78 | 70 | 10.31 |
| 4 | 0.65 | 498 | 2107 | 439 | 400 | 2946 | 79 | 72 | 14.45 |
| 5 | 0.67 | 575 | 2071 | 413 | 385 | 2869 | 79 | 72 | 16.69 |
| 6 | 0.75 | 844 | 1978 | 306 | 316 | 2600 | 82 | 76 | 24.50 |
| 7 | 0.83 | 1153 | 1849 | 204 | 238 | 2291 | 85 | 81 | 33.47 |

VLA — volume learning algorithm; GMDH — algorithm of the group method of data handling; LOO — leave-one-out. Here N is the total number of histograms.
In Table 3, the following notation is used:
N is the total number of histograms;
S_rej is the number of rejections to make a decision;
S_00 is the number of gross classification errors;
S_50 is the number of minor classification errors;
S_100 is the number of histograms classified correctly;
N − S_rej is the number of observations that are classified;
M (%) is the «soft» estimate of recognition accuracy (in percent):
M = 100 (S_100 + 0.5 S_50) / (N − S_rej);
S (%) is the «rigid» estimate of recognition accuracy (in percent):
S = 100 S_100 / (N − S_rej);
O_rej (%) is the percentage of rejections:
O_rej = 100 S_rej / N.
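These definitions can be checked directly against the first GMDH row of Table 3 (S_rej = 0, S_100 = 2535, S_50 = 489, N = 3444):

```python
def estimates(s100, s50, s_rej, n_total):
    """Soft (M), rigid (S) and rejection (O_rej) percentages from the
    definitions above."""
    classified = n_total - s_rej
    m = 100.0 * (s100 + 0.5 * s50) / classified  # soft: minor errors count half
    s = 100.0 * s100 / classified                # rigid: only exact hits count
    o = 100.0 * s_rej / n_total                  # fraction of rejected decisions
    return m, s, o

m, s, o = estimates(2535, 489, 0, 3444)
```

Rounding the results reproduces the tabulated M = 81% and S = 74% for that row.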
We used two recognition accuracy types (soft and rigid) to differentiate gross prediction errors from minor ones (an incorrect prediction of a «neighboring» type), which affect the general pattern of the empirical distribution of neural links over interaction types only insignificantly (the approximate symmetry of errors for the pair of types under consideration is taken into account) [9]. By neighboring types we mean interaction types of kind C (a common input) under the condition that they differ by only one grade on the strength or duration scale. For example, for the type C1C the neighboring types are C1D and C2C (Table 2), whereas for the type C2D the neighboring types are C1D, C2C, C2E and C3D. It will be observed that the type E** has no neighboring types.
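The neighboring rule can be written out explicitly as follows (a sketch of the rule as stated above; the exclusion of the merged C0 type from the neighbor sets is our reading of the paper's examples, since C1C is listed with only two neighbors):

```python
def neighbors(code):
    """Neighboring interaction types of a kind-C code: codes differing by
    exactly one grade on the strength or the duration scale. Only C-type
    codes have neighbors; the merged C0 type is excluded, following the
    examples given in the text."""
    if not (len(code) == 3 and code[0] == "C"):
        return set()
    strengths, durations = "123", "CDE"
    s, d = strengths.index(code[1]), durations.index(code[2])
    out = set()
    for s2 in (s - 1, s + 1):                      # one grade up/down in strength
        if 0 <= s2 < len(strengths):
            out.add(f"C{strengths[s2]}{code[2]}")
    for d2 in (d - 1, d + 1):                      # one grade up/down in duration
        if 0 <= d2 < len(durations):
            out.add(f"C{code[1]}{durations[d2]}")
    return out
```

This reproduces both worked examples: C1C has neighbors {C1D, C2C}, and C2D has {C1D, C2C, C2E, C3D}.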
Interesting results are provided by an analysis of the distribution of recognition accuracy over various threshold values P0 (Table 3). The condition for a decision rejection can be written in the following form:
V1 − V2 < P0.
Here V1 and V2 are the two highest predictions among the neural network outputs. The threshold value P0 = 0 in Table 3 corresponds to the results received according to the scheme without rejection of decisions.
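The rejection rule amounts to a few lines of code (an illustration; the function name is ours):

```python
def classify_with_rejection(outputs, p0):
    """Decision rule described above: reject when the gap between the two
    highest network outputs, V1 - V2, is below the threshold P0; otherwise
    return the index of the winning class."""
    v = sorted(outputs, reverse=True)
    if v[0] - v[1] < p0:
        return None  # rejection of the decision
    return max(range(len(outputs)), key=outputs.__getitem__)
```

With P0 = 0 nothing is ever rejected, which matches the first row of each half of Table 3.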
In order to estimate the prediction quality of the VLA algorithm, let us compare it with the results received by the GMDH algorithm. The analysis of the GMDH results showed that the percentage of rejections for threshold values P0 between 0.45 and 0.55 increased only slightly; therefore P0 = 0.5 could be taken as the optimal value.
For the training data set (at the threshold value P0 = 0.5), the soft estimate of recognition accuracy of the GMDH was 85% and the rigid estimate was 79% (the number of rejected decisions was 499). For a very close number of rejected decisions (499), the VLA results were 79% for the soft and 72% for the rigid estimate of recognition quality. The VLA results were also similar to those calculated by the GMDH for the threshold value P0 = 0.75 (see Table 3). The lower performance of the ASNN compared to the GMDH method can be to some extent explained by the more severe validation procedure used in this algorithm. Indeed, for the training of the ASNN only 50% of the initial samples from the training set (i.e., N = 1722 samples) were used to adjust the neural network weights. The remaining samples were used to determine the stopping conditions for ASNN training and to calculate the LOO results (see refs. [19, 20] for details). At the same time, the GMDH results were calculated using N = 3443 samples. Thus, a difference in the implementation details of the two methods could account for the apparent difference in their performance on the training set.
In order to independently estimate the prediction quality of the algorithm, the VLA was also applied to the test set of 2666 histograms. The performance of the method on the test set was comparable to that on the training set: the soft estimate of prediction accuracy (at the threshold value P0 = 0.61) was 79% and the rigid estimate of recognition quality was 75% (the number of rejected decisions was 298). For a very similar number of rejected decisions (295), the GMDH results were 81% soft and 76% rigid classification performance.
These results demonstrate that the VLA can be applied to recognizing the type of interaction between two neurons. Moreover, compared to the previous method, the VLA uses the raw parameters of the cross-correlation histograms and does not require the calculation of additional indices or expert knowledge, while providing comparable accuracy.
CONCLUSION
We have introduced a fast, automatic system based on the Volume Learning Algorithm for recognizing the type of interaction between two neurons. The self-organizing map clustered the input parameters, and the ASNNs used their mean values to correlate the type of analyzed activity with the cross-correlograms. This significantly decreased the number of input parameters and made it possible to calculate models with a prediction ability similar to the GMDH-based approach developed using expert knowledge.
Acknowledgment
This study was partially supported by INTAS-OPEN grant 97-0173 and
SNSF SCOPES 7IP 62620.
REFERENCES
1. Bergman H., Feingold A., Nini A., Raz A., Slovin H., Abeles M., and Vaadia E. // Trends in Neurosciences. — 1998. — 21, № 1. — P. 32–38.
2. Villa A.E.P., Hyland B., Tetko I.V., and Najem A. Dynamical Cell Assemblies in the Rat Auditory Cortex in a Reaction-Time Task // Biosystems. — 1998. — 48. — P. 269–277.
3. Villa A.E.P., Tetko I.V., Hyland B., and Najem A. Significance of Spatiotemporal Ac-
tivity Patterns among Rat Cortex Neurons in Performance of a Conditioned Task
// Proc. Nat. Acad.Sci. USA, 1999. — 96, № 3. — P. 1106–1111.
4. Vaadia E. Haalman I., Abeles M., Bergman H, Prut Y., Slovin H., and Aertsen A. //
Nature. — 1995. — 373. — P. 515–518.
5. Borisyuk G.N., Borisyuk R.M., Kirillov A.B., Kovalenko E.I., and Kryukov V.I. // Bi-
ol. Cybern.. — 1985. — 52. — P. 301–306.
6. Prut Y., Vaadia E., Bergman H., Haalman I., Slovin H., and Abeles M. Spatiotem-
poral Structure of Cortical Activity, Properties and Behavioral Relevance // J.
Neurophysiol. — 1998. — 79. — P. 2857–2874.
7. Tetko I.V. and Villa A.E.P. Fast Combinatorial Methods to Estimate the Probability of Complex Temporal Patterns of Spikes // Biol. Cybern. — 1997. — 76. — P. 397–407.
8. Perkel D.H., Gerstein G.L., and Moore G.P. // Biophys. J. — 1967. — 7, № 4. — P. 391–418.
9. Villa A.E.P., Tetko I.V., Ivakhnenko A.G., Ivakhnenko G.A., Sarychev A.P. Recogni-
tion of the type of Interaction between Neurons from their cross-correlation his-
tograms with use of the voting procedure // Pattern Recognition and Image Anal-
ysis. — 2001. — 11, № 4. — P. 743–750.
10. Tetko I.V., Kovalishyn V.V., Livingstone D.J. Volume Learning Algorithm Artificial
Neural Networks for 3D QSAR studies // J. Med. Chem. — 2001. — 44. —
P. 2411–2420.
11. Kovalishyn V.V, Tetko I.V., Luik A.I., Chretien J.R., and Livingstone D.J. Applica-
tion of Neural Networks Using the Volume Learning Algorithm for Quantitative
Study of the Three-Dimensional Structure–Activity Relationships of Chemical
Compounds // Russian Journal of Bioorganic Chemistry — 2001. — 27, № 4. —
P. 267–277.
12. Tetko I.V., Kovalishyn V.V., Luik A.I., Livingstone D.J. Application of Volume
Learning Artificial Neural Network to Calculate 3D QSAR Models with En-
hanced Predictive Properties // In: Rational Approaches to Drug Design. Eds.
H.-D. Hoeltje and W. Sippl. I — Barcelona: Prous Science. — 2001. — P. 229–234.
13. Villa A.E.P., Tetko I.V., Dutoit P., Ribaupierre Y.D., Ribaupierre F.D. Corticofugal
Modulation of Functional Connectivity within the Auditory Thalamus of Rat,
Guinea Pig and Cat Revealed by Cooling Deactivation // Journal of Neuroscience
Methods. — 1999. — № 86. — P. 161–178.
14. Kohonen T. Self-Organizing Maps. — Berlin: Springer-Verlag, 1995. — 401 p.
15. Simon V., Gasteiger J., Zupan J. A Combined Application of Two Different Neural
Network Types for the Prediction of Chemical Reactivity // J. Am. Chem.
Soc. — 1993. — 115. — P. 9148–9159.
16. Zupan J., Gasteiger J. Neural Networks for Chemistry and Drug Design: An Introduction. — 2nd edition. — Weinheim: VCH, 1999. — 380 p.
17. Tetko I.V., Luik A.I., Poda G.I. Application of Neural Networks in Structure-Activity Relationships of a Small Number of Molecules // J. Med. Chem. — 1993. — 36. — P. 811–814.
18. Tetko I.V. Neural Network Studies. 4. Introduction to Associative Neural Networks // J. Chem. Inf. Comput. Sci. — 2002. — 42. — P. 717–728.
19. Tetko I.V., Livingstone D.J. and Luik A.I. Neural Network Studies. 1. Comparison of Overfitting and Overtraining // J. Chem. Inf. Comput. Sci. — 1995. — 35. — P. 826–833.
20. Tetko I.V., Villa A.E.P. Efficient Partition of Learning Data Sets for Neural Network Training // Neural Networks. — 1997. — 10. — P. 1361–1374.
Received 07.06.2004
From the Editorial Board: The article corresponds completely to submitted manu-
script.