Neural network synthesis based on evolutionary optimization


Detailed Description

Bibliographic Details
Published in: Системні дослідження та інформаційні технології
Date: 2015
Main authors: Oliinyk, A.A., Subbotin, S.A.
Format: Article
Language: English
Published: Навчально-науковий комплекс "Інститут прикладного системного аналізу" НТУУ "КПІ" МОН та НАН України, 2015
Subjects:
Online access: https://nasplib.isofts.kiev.ua/handle/123456789/86133
Journal title: Digital Library of Periodicals of National Academy of Sciences of Ukraine
Cite as: Neural network synthesis based on evolutionary optimization / A.A. Oliinyk, S.A. Subbotin // Системні дослідження та інформаційні технології. — 2015. — № 1. — P. 77–86. — Bibliography: 14 titles. — In English.

Institution

Digital Library of Periodicals of National Academy of Sciences of Ukraine
id nasplib_isofts_kiev_ua-123456789-86133
record_format dspace
deposited 2015-09-08T11:07:33Z
issn 1681–6048
udc 004.93
The evolutionary approach to neural network structural synthesis is considered in this paper. A new method of multimodal evolutionary search with chromosome clustering is offered. The developed method is based on the idea of a simultaneous search for several optima: chromosomes are grouped into clusters by their arrangement in the search space. Stable subpopulations are thus formed in different clusters, search diversity is provided, and convergence to different local minima is reached, which allows finding neural network architectures closer to the optimal one. Software implementing the proposed method is developed, and experiments with the method on a practical problem were conducted.
status published earlier
institution Digital Library of Periodicals of National Academy of Sciences of Ukraine
collection DSpace DC
title Neural network synthesis based on evolutionary optimization
author Oliinyk, A.A.; Subbotin, S.A.
topic Проблемно і функціонально орієнтовані комп’ютерні системи та мережі (Problem- and function-oriented computer systems and networks)
publishDate 2015
language English
container_title Системні дослідження та інформаційні технології
publisher Навчально-науковий комплекс "Інститут прикладного системного аналізу" НТУУ "КПІ" МОН та НАН України
format Article
title_alt Синтез нейронних мереж на основі еволюційної оптимізації; Синтез нейронных сетей на основе эволюционной оптимизации
fulltext © A.A. Oliinyk, S.A. Subbotin, 2015
Системні дослідження та інформаційні технології, 2015, № 1

UDC 004.93

NEURAL NETWORK SYNTHESIS BASED ON EVOLUTIONARY OPTIMIZATION

A.A. OLIINYK, S.A. SUBBOTIN

The evolutionary approach to neural network structural synthesis is considered in this paper. A new method of multimodal evolutionary search with chromosome clustering is offered. The developed method is based on the idea of a simultaneous search for several optima: chromosomes are grouped into clusters by their arrangement in the search space. Stable subpopulations are thus formed in different clusters, search diversity is provided, and convergence to different local minima is reached, which allows finding neural network architectures closer to the optimal one. Software implementing the proposed method is developed, and experiments with the method on a practical problem were conducted.

INTRODUCTION

Nowadays neural networks, with their ability to derive meaning from complicated or imprecise data, are widely used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques [1, 2].

It is known that the architecture of a neural network determines its information processing capability [3], so architecture design has become one of the most important tasks in neural network research and application. The architecture of a neural network includes its topological structure and the transfer and discriminant function of each node in the network.

To date, architecture design has remained largely a human expert's job. It depends heavily on expert experience and a tedious trial-and-error process; there is no systematic way to automatically design a near-optimal architecture for a given task. The synthesis of a neural network amounts to the optimization of some criterion, such as the sum of squared errors.
However, solving this optimization task is complicated by the high dimensionality of the training sample, the multiextremal nature of the criterion function, and the nondifferentiability of activation functions [2], which complicates or rules out the application of traditional optimization methods [1].

Research on constructive and destructive algorithms represents an effort toward the automatic design of architectures. A constructive algorithm starts with a minimal network (a network with a minimal number of hidden layers, nodes, and connections) and adds new layers, nodes, and connections when necessary during training, while a destructive algorithm does the opposite: it starts with the maximal network and deletes unnecessary layers, nodes, and connections during training. However, such structural hill-climbing methods are susceptible to becoming trapped at structural local optima. In addition, they investigate only restricted topological subsets rather than the complete class of network architectures [3].

For the synthesis of neural networks it is expedient to use evolutionary search methods, a family of computational models inspired by evolution. These methods differ from more traditional optimization techniques in that they search from a population of solutions rather than from a single point. Each iteration of an evolutionary method involves a competitive selection that weeds out poor solutions; solutions with high fitness are recombined with other solutions by swapping parts between them [4]. However, the result of evolutionary optimization is a set of identical or barely distinguishable solutions.
Therefore, the optimal neural network structure may not be found, because classical evolutionary methods cannot cover the search space uniformly, and large regions of the variable space may remain unexplored within a limited number of iterations.

The purpose of this work is therefore to develop a multimodal evolutionary search method that raises the diversity of the population and covers the search space more uniformly, yielding a set of diverse solutions (neural network structures) from which the architecture best satisfying external criteria can be chosen.

PROBLEM STATEMENT

Let A be the maximal allowable number of neurons in the network and ⟨X, Y⟩ a training sample, where X = {X_i} is a set of feature values describing the considered object or process; Y = {y_p} is a set of target values; X_i = {x_ip} is the i-th feature in the sample, i = 1, 2, …, L; x_ip is the value of the i-th feature for the p-th observation, p = 1, 2, …, m; y_p is the value of the predicted parameter for the p-th observation; L is the number of features; and m is the number of observations.

The problem of structural synthesis of a neural network NN = NN(C) can then be formulated as the search problem ξ = ξ(NN, X, Y) → opt in architecture space, where C = C(L, A) is a matrix determining the presence or absence of connections between elements of the network NN, and ξ(NN, X, Y) is an optimality criterion, e.g., lowest training error, lowest network complexity, etc.

STRUCTURAL SYNTHESIS OF NEURAL NETWORKS BASED ON EVOLUTIONARY OPTIMIZATION

To apply evolutionary search to neural network synthesis, it is necessary to define a scheme for representing the network structure in a chromosome and to choose a fitness function for evaluating chromosomes.
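As one concrete reading of the optimality criterion ξ(NN, X, Y), the following minimal sketch combines the two goals the paper names (low training error, low network complexity) into a single value. The function names and the penalty weight `alpha` are our assumptions; the paper does not fix a particular weighting.

```python
def criterion(predict, X, Y, n_connections, alpha=0.001):
    """Sketch of an optimality criterion xi(NN, X, Y): the sum of squared
    errors over the training sample plus a complexity penalty proportional
    to the number of connections. `alpha` is a hypothetical weight."""
    sse = sum((predict(x) - y) ** 2 for x, y in zip(X, Y))
    return sse + alpha * n_connections
```

A perfect predictor with 10 connections would then score 0.001 × 10 = 0.01 under the default weight; lower values are better.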
There are the following methods for encoding information about neural network structure in chromosomes [3]: direct encoding, parametric representation, developmental rule representation, fractal representation, etc. One of the most effective is direct encoding, in which the presence of each possible connection is described directly in a binary connection matrix C, where the value c_ij corresponds to the presence (c_ij = 1) or absence (c_ij = 0) of a connection from the i-th to the j-th neuron. The neural network is thus represented by a connectivity matrix, and the chromosome (Fig. 1) in the direct encoding scheme is the bit string listing the presence of connections.

A chromosome is decoded into a neural network structure as follows.

Step 1. Generate the connectivity matrix (Fig. 1, b) of the neural network corresponding to the chromosome (Fig. 1, a).

Step 2. Construct a graph (Fig. 1, c) based on the connectivity matrix.

Step 3. Synthesize a neural network (Fig. 1, d) from the graph constructed in the previous step, removing neurons that have no outgoing connections to neurons of subsequent layers.

If the neuron activation functions must also be chosen during structural synthesis, additional genes containing the kind of activation function for each neuron can be introduced into the chromosome.

Fig. 1. An example of a chromosome and its decoding: a — chromosome, b — connectivity matrix, c — graph (architecture), d — synthesized neural network. [Figure omitted: a 21-bit chromosome, the corresponding 7×7 upper-triangular connectivity matrix, the derived graph over nodes 1–7 with inputs x1–x3, and the pruned network with output y.]

Structural synthesis of a neural network based on the evolutionary approach can be executed as the following sequence of steps [5–8].

Step 1. Generate the initial population of chromosomes containing information about network structures.

Step 2. Compute the fitness of each chromosome in the current population.

Step 2.1. Decode each chromosome in the population into a neural network architecture.

Step 2.2. Train each neural network by the chosen rule using the data from the training sample.

Step 2.3. Calculate the value of the fitness function, taking into account the training error and the complexity of the constructed network.

Step 3. Check the search termination criteria. If they are satisfied, go to Step 7.

Step 4. Select the most fit chromosomes for crossover and mutation.

Step 5. Execute the crossover and mutation operators on the selected chromosomes.

Step 6. Create a new generation from the chromosomes obtained in the previous step and the elite chromosomes of the current generation. Go to Step 2.

Step 7. End.

A MULTIMODAL EVOLUTIONARY METHOD FOR STRUCTURAL SYNTHESIS OF NEURAL NETWORKS

It is shown in [3] that the performance surface for neural network structural synthesis is nondifferentiable, noisy, complex, and multimodal, since different architectures may have similar performance.

Classical evolutionary methods converge to a population of few distinguishable solutions, so the found solution may turn out to be a local optimum of a multiextremal function. Such a solution (neural network structure), as a rule, is inefficient in practical use.

Therefore, for structural synthesis of neural networks it is expedient to use evolutionary methods capable of finding several suboptimal solutions. The main problem in using traditional evolutionary methods for the optimization of multimodal functions is premature convergence to a local optimum.
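The direct-encoding decoding scheme described above can be sketched as follows. The bit ordering (row by row over the upper triangle of the connectivity matrix) and the pruning rule (keep only neurons with a path to an output) are our reading of Fig. 1; function names are ours.

```python
import numpy as np

def decode(chromosome, n):
    """Decode a bit string into an n x n connectivity matrix C, where
    C[i, j] = 1 means a connection from neuron i to neuron j (j > i,
    i.e., a feed-forward network). Bits are taken row by row over the
    upper triangle, our assumed ordering."""
    C = np.zeros((n, n), dtype=int)
    k = 0
    for i in range(n):
        for j in range(i + 1, n):
            C[i, j] = chromosome[k]
            k += 1
    return C

def prune(C, outputs):
    """Step 3 of the decoding scheme: keep only neurons that reach an
    output neuron through the connection graph; return their indices."""
    n = C.shape[0]
    keep = set(outputs)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i not in keep and any(C[i, j] and j in keep for j in range(n)):
                keep.add(i)
                changed = True
    return sorted(keep)
```

For a 7-neuron network, as in Fig. 1, the chromosome has 7·6/2 = 21 bits; neurons whose connections never lead to the output are dropped from the synthesized network.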
To overcome this problem, two groups of methods have been developed: avoidance strategies and repair strategies [9–13].

In avoidance strategies, the main idea is to prevent premature convergence to a local optimum [9–12]. Algorithms attempting to slow down genetic convergence aim at maintaining the population's diversity for a longer period and thereby avoid stagnation in a local optimum. Algorithms in this category either use a replacement scheme for updating the population or try to reduce the spread of genes by introducing a spatial population topology. Strategies that try to prevent overlap of solutions use penalty functions to reduce the probability of similar solutions occurring in the population; this requires a penalty function calculation for each chromosome and hence considerably slows down the evolutionary search.

In repair strategies, algorithms maintain diversity either by mass extinction techniques or by introducing new genetic material when population convergence is detected, which also demands significant time expenses [9, 13].

In the developed method of multimodal evolutionary search with chromosome clustering, it is proposed to group solutions (chromosomes) into clusters by their arrangement in the search space. During the evolutionary search, the method identifies groups of similar chromosomes and raises the diversity of the population by reducing the fitness function values of chromosomes depending on their closeness to the center of their group.

The developed multimodal evolutionary search with chromosome clustering consists of the following steps.

Step 1. Set the number of optima (the number of optimal neural network architectures) k to be found during the evolutionary search, and the number of chromosomes in the population N, N >> k.

Step 2. Set the iteration counter: t = 1.

Step 3. Set the number of elite chromosomes: k_e = k.

Step 4. Initialize the population with chromosomes H_j (j = 1, 2, …, N) of length L (the quantity of features).

Step 5. Calculate the fitness function value f(H_j) for each chromosome H_j.

Step 6. Group the chromosomes into k clusters based on their fitness function values and their arrangement in the architecture space.

Step 6.1. For each chromosome H_j, calculate the Hamming distance [2] to all other chromosomes in the population. The Hamming distance d between chromosomes H_j and H_l is calculated by the formula

d = \sum_{u=1}^{L} |h_{ju} - h_{lu}|,

where h_{ju} and h_{lu} are the values of the genes of chromosomes H_j and H_l, respectively.

Step 6.2. Set the counter of generated clusters: m = 1.

Step 6.3. Choose a chromosome with the best fitness function value as the center of the m-th cluster; only chromosomes not yet grouped into clusters are considered.

Step 6.4. Add to the cluster the (N/k − 1) chromosomes nearest by Hamming distance to the chromosome that is the center of the current m-th cluster.

Step 6.5. If m = k, go to Step 7.

Step 6.6. Set m = m + 1. Go to Step 6.3.

Step 7. Reduce the fitness function values of the chromosomes that are not the best in their cluster using the formula

f_{n,j} = (d_j / d_{max,j})^s · f_j,

where f_j is the fitness function value of the j-th chromosome before the change; f_{n,j} is its new fitness function value; d_j is the Hamming distance from the j-th chromosome to the center of its cluster; d_{max,j} is the maximal Hamming distance within the cluster of the j-th chromosome; and s is the parameter determining the degree of fitness reduction for chromosomes that are not cluster centers, s ≥ 1.

Step 8. Apply the crossover and mutation operators.
Step 9. Generate the new population. The best (elite) chromosomes in every cluster are guaranteed to pass into the new generation.

Step 10. If t = T (T is the maximum number of iterations), go to Step 13.

Step 11. Set t = t + 1.

Step 12. Go to Step 5.

Step 13. Evaluate each of the k chromosomes that are cluster centers using the test sample. Choose the best chromosome; the neural network corresponding to it is the solution.

Step 14. End.

The developed method of multimodal evolutionary search with chromosome clustering raises the diversity of the population and covers the search space more uniformly, thereby increasing the chance of finding a global optimum and the probability of successfully evaluating the found solutions against external criteria on the test sample.

EXPERIMENTS AND RESULTS

The suggested method of multimodal evolutionary search with chromosome clustering was implemented as a computer program. The experimental research of the proposed neural network synthesis method was carried out on a vehicle classification problem using 2D gray-scale images.

The initial sample contained transformed graphic representations of vehicles captured by street video cameras in Zaporozhye, Ukraine. The sample consisted of 1062 vehicle images, each characterized by 4096 features representing normalized intensity values of the image points projected on a 64×64-pixel sensor matrix. From these 4096 features, 26 generalizing features were calculated. Vehicles were classified into cars, minibuses, motorcycles, trucks, and buses. For each class of vehicle a model was constructed; thus the problem consisted in the synthesis of four classification models of vehicle types based on the 26 generalizing features [14].
The coding of a potential solution was performed as in Fig. 1. The following evolutionary operators were used: roulette-wheel selection, uniform crossover, and simple mutation. The initial parameters of all evolutionary methods were set as follows: population size N = 100; crossover probability p_cr = 0.8; mutation rate r_m = 0.02. Stopping criteria: maximum number of iterations T = 100; achievement of an acceptable fitness function value equal to 0.01.

The purpose of the experiments was to synthesize the optimal neural network model. The maximal allowable sum of squared errors for this model on the training data and the test sample is 0.01 and 0.02, respectively. Results of experiments for different numbers of clusters are presented in Table 1 (SSE_ls is the sum of squared errors on the learning sample, SSE_ts is the sum of squared errors on the test sample, and n is the number of obtained models providing a sufficient sum of squared errors on the test sample).

Table 1. Results of Experiments

Quantity of clusters | SSE_ls | SSE_ts | n
1 | 0.0098 | 0.0272 | 0
2 | 0.0095 | 0.0227 | 0
3 | 0.0096 | 0.0183 | 1
4 | 0.0094 | 0.0142 | 1
5 | 0.0095 | 0.0137 | 2

An example of running different evolutionary methods on the vehicle classification problem is shown in Fig. 2. The comparison of the proposed method with other evolutionary methods is presented in Table 2 (τ is the time of evolutionary optimization, f_count is the number of fitness function calculations). The best solution (neural network structure) found by the proposed method is shown in Fig. 3.

Table 2. Comparison of Evolutionary Methods

Method | τ | f_count | SSE_ls | SSE_ts
Canonical genetic algorithm | 709.3 | 9619 | 0.0098 | 0.0272
Avoidance strategies method | 587.9 | 8018 | 0.0095 | 0.0191
Repair strategies method | 629.8 | 8714 | 0.0097 | 0.0158
Multimodal evolutionary search | 521.7 | 7092 | 0.0095 | 0.0137

Fig. 2. An example of evolutionary methods running. [Plot omitted.]

Fig. 3. The best solution (neural network structure). [Diagram omitted: a three-layer network over a subset of the input features x1–x26 with output neuron 9 producing y.]

The weight matrix of the synthesized neural network is presented in Table 3, where μ is the number of the layer in the network; ρ is the number of the neuron in the layer; b is the bias value; node is the first node of the connection; and w is the weight value.

Table 3. The Weight Matrix of the Synthesized Neural Network

μ | ρ | b | node | w
1 | 1 | −0.1852 | feature x1 | −1.7264
  |   |         | feature x7 | 2.2621
  |   |         | feature x10 | 0.8386
  |   |         | feature x12 | −0.1721
1 | 2 | −0.5291 | feature x2 | −0.7502
  |   |         | feature x5 | −1.1064
  |   |         | feature x6 | 2.1628
  |   |         | feature x20 | 1.8531
  |   |         | feature x23 | 1.0372
1 | 3 | 1.2674 | feature x7 | 0.8580
  |   |        | feature x8 | −0.6082
  |   |        | feature x11 | −0.5810
  |   |        | feature x15 | −1.1310
1 | 4 | −0.9620 | feature x5 | 0.5502
  |   |         | feature x16 | 1.3051
  |   |         | feature x18 | 1.2109
  |   |         | feature x23 | −2.0945
1 | 5 | 0.4281 | feature x2 | 1.0845
  |   |        | feature x10 | −1.6093
  |   |        | feature x25 | 0.8803
2 | 6 | −1.6528 | neuron 1 | 0.7726
  |   |         | neuron 3 | −0.6190
2 | 7 | 2.9381 | neuron 2 | −1.5278
  |   |        | neuron 3 | −1.7429
  |   |        | neuron 4 | 0.9726
2 | 8 | 0.7804 | neuron 3 | 0.7904
  |   |        | neuron 4 | 0.4086
  |   |        | neuron 5 | −1.0462
3 | 9 | 0.3174 | neuron 6 | 0.8054
  |   |        | neuron 7 | −1.7960
  |   |        | neuron 8 | 0.9467

The experiments have shown that, as a result of applying multimodal evolutionary search with chromosome clustering, stable subpopulations are formed in different clusters, heterogeneity of search is provided, and convergence to different local minima is reached.

CONCLUSION

A new method of multimodal evolutionary search with chromosome clustering is offered in this paper. The developed method is based on the idea of a simultaneous search for several optima: solutions (architectures) are grouped into clusters by their arrangement in the architecture space, which results in a more uniform covering of the search space. Comparison of the results obtained with the developed method against the results of classical evolutionary methods shows that the offered method synthesizes neural networks closer to the optimum because of this more uniform covering. Thus, the suggested method can be recommended for practical application to various problems in pattern recognition and computational diagnosis.

REFERENCES

1. Ripley B. Pattern Recognition and Neural Networks. — Cambridge: Cambridge University Press, 2008. — 416 p.
2. Yao X. Evolving Artificial Neural Networks // Proceedings of the IEEE. — 1999. — 87, № 9. — P. 1423–1447.
3. Haykin S. Neural Networks: A Comprehensive Foundation. — New Jersey: Prentice Hall, 1999. — 842 p.
4. Haupt R., Haupt S. Practical Genetic Algorithms. — Hoboken: John Wiley & Sons, 2004. — 272 p.
5. Siebel N.T., Kassahun Y. Learning Neural Networks for Visual Servoing Using Evolutionary Methods // Hybrid Intelligent Systems: Sixth International Conference, 13–15 December 2006, Auckland: Proceedings. — Los Alamitos: IEEE, 2006. — P. 6–14.
6. Rocha M., Cortez P., Neves J. Simultaneous evolution of neural network topologies and weights for classification and regression // Computational Intelligence and Bioinspired Systems. — Berlin: Springer, 2005. — P. 59–66.
7. Thammano A., Meengen A. A New Evolutionary Neural Network Classifier // Advances in Knowledge Discovery and Data Mining. — Berlin: Springer, 2005. — P. 249–255.
8. Siebel N., Krause J., Sommer G. Efficient learning of neural networks with evolutionary algorithms // Pattern Recognition: 29th DAGM Conference: Proceedings. — Berlin: Springer, 2007. — P. 466–475.
9. Ursem R.K. Multinational Evolutionary Algorithms // Evolutionary Computation: CEC 99 Congress: Proceedings. — Los Alamitos: IEEE, 1999. — P. 1633–1640.
10. Shimodaira H. A Diversity Control Oriented Genetic Algorithm (DCGA): Development and Experimental Results // Genetic and Evolutionary Computation. — Orlando: Morgan Kaufmann, 2000. — P. 603–611.
11. Thomsen R., Rickers P., Krink T. A Religion-Based Spatial Model for Evolutionary Algorithms // Parallel Problem Solving from Nature. — Berlin: Springer, 2000. — P. 817–826.
12. Tsutsui S., Fujimoto Y., Ghosh A. Forking Genetic Algorithms: GAs with Search Space Division Schemes // Evolutionary Computation. — 1997. — 5. — P. 61–80.
13. Ursem R.K. Diversity-Guided Evolutionary Algorithms // Parallel Problem Solving from Nature. — Berlin: Springer, 2002. — P. 462–471.
14. Oleinik A.A. Selection of a system of informative features for vehicle classification based on evolutionary search [in Russian] // Комп’ютерне моделювання та інтелектуальні системи: збірник наукових праць / Eds. D.M. Piza, S.O. Subbotin. — Запоріжжя: ЗНТУ, 2007. — P. 134–146.

Received 14.02.2014

From the Editorial Board: the article corresponds completely to the submitted manuscript.