ALGORITHM OF NEURON NETWORKS MODIFICATION
DOI: https://doi.org/10.18372/1990-5548.56.12936

Keywords: neuron networks, optimization problem, hybrid multicriteria evolutionary algorithm, method of steepest descent, algorithm of merging and growing

Abstract
The paper considers the problem of modifying a neural network whose topology has previously been chosen by solving an optimization problem for a given task. The proposed modification algorithm is a two-stage procedure that combines a genetic algorithm with a local optimization algorithm. The modification problem is split into two tasks: searching for an optimal neural network structure and adjusting the weight coefficients. These two tasks are solved by the two-stage algorithm: at the first stage a hybrid multicriteria evolutionary algorithm is applied, and at the second stage the values of the weight coefficients are determined using the error back-propagation method together with the method of steepest descent. The optimal number of hidden layers is determined with an adaptive merging and growing algorithm.
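For illustration only, the sketch below follows the two-stage scheme described in the abstract: a small genetic search over the hidden-layer structure (stage one), with each candidate's weights adjusted by back-propagation and a steepest-descent update (stage two). Everything here is an assumption made for the example rather than the authors' implementation: the function names (init_net, train_weights, evolve_structure), the sigmoid/MSE network, the hyperparameters, and the reduction of the adaptive merging-and-growing step to removing or adding a single hidden neuron during mutation.

# Illustrative sketch (not the authors' code) of the two-stage procedure:
# stage 1 evolves the hidden-layer structure, stage 2 tunes the weights by
# steepest descent on the back-propagated error gradient.
import numpy as np

rng = np.random.default_rng(0)

def init_net(sizes):
    """Random weights and zero biases for a fully connected net with layer sizes `sizes`."""
    return [(rng.normal(0, 0.5, (m, n)), np.zeros(n)) for m, n in zip(sizes[:-1], sizes[1:])]

def forward(net, x):
    """Return the activations of every layer (sigmoid hidden and output units)."""
    acts = [x]
    for w, b in net:
        acts.append(1.0 / (1.0 + np.exp(-(acts[-1] @ w + b))))
    return acts

def train_weights(net, x, y, epochs=200, lr=0.5):
    """Stage 2: back-propagation of the error with a steepest-descent weight update."""
    for _ in range(epochs):
        acts = forward(net, x)
        delta = (acts[-1] - y) * acts[-1] * (1 - acts[-1])      # output-layer error
        for i in reversed(range(len(net))):
            w, b = net[i]
            grad_w = acts[i].T @ delta / len(x)
            grad_b = delta.mean(axis=0)
            if i > 0:                                           # propagate the error backwards
                delta = (delta @ w.T) * acts[i] * (1 - acts[i])
            net[i] = (w - lr * grad_w, b - lr * grad_b)         # steepest-descent step
    return net

def fitness(hidden, x, y, size_penalty=1e-3):
    """Scalarized multicriteria score: fit error plus a complexity penalty on hidden size."""
    net = train_weights(init_net([x.shape[1]] + hidden + [y.shape[1]]), x, y)
    err = np.mean((forward(net, x)[-1] - y) ** 2)
    return err + size_penalty * sum(hidden), net

def evolve_structure(x, y, pop=6, gens=10):
    """Stage 1: evolve the hidden-layer size; mutation adds or removes one neuron,
    loosely mimicking an adaptive grow/merge step."""
    population = [[int(rng.integers(2, 8))] for _ in range(pop)]
    best_score, best_struct, best_net = np.inf, None, None
    for _ in range(gens):
        scored = []
        for hidden in population:
            score, net = fitness(hidden, x, y)
            scored.append((score, hidden, net))
            if score < best_score:
                best_score, best_struct, best_net = score, hidden, net
        scored.sort(key=lambda s: s[0])
        parents = [h for _, h, _ in scored[: pop // 2]]
        population = parents + [[max(1, h[0] + int(rng.choice([-1, 1])))] for h in parents]
    return best_struct, best_net

if __name__ == "__main__":
    # Toy XOR data, used only to exercise the sketch.
    x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([[0], [1], [1], [0]], float)
    structure, net = evolve_structure(x, y)
    print("chosen hidden layer sizes:", structure)
    print("outputs:", forward(net, x)[-1].ravel().round(2))

The size_penalty term in fitness is only a stand-in for a multicriteria trade-off between approximation error and network size; a full multicriteria evolutionary algorithm would treat these as separate objectives rather than a weighted sum.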