Deep learning fuzzy classifier
DOI: https://doi.org/10.18372/1990-5548.60.13813
Keywords: neural network, fuzzy neural network, deep learning
Abstract
A solution of the classification problem is considered on the basis of the presented review. It is shown that neural networks have important advantages over other methods, such as nearest-neighbor classification, support vector classification, classification using decision trees, etc. Among the existing artificial neural networks, feedforward networks have the simplest structure, but the precision of the solution can be increased with the help of the deep learning approach, which supposes the use of an additional neural network for solving the pretraining task (deep belief networks). A new topology is proposed, consisting of a Takagi-Sugeno-Kang fuzzy classifier and a restricted Boltzmann machine neural network. Although this topology was proposed earlier, enough research has been carried out in this article to specify the learning algorithm. An example of the implementation of the proposed algorithm is presented.
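The abstract does not reproduce the learning algorithm itself. As a rough illustration only, the Python sketch below shows one plausible form of the described topology: a restricted Boltzmann machine pretrains a hidden feature layer, and a zero-order Takagi-Sugeno-Kang fuzzy classifier with Gaussian rule antecedents and least-squares consequents is fitted on those features. The use of scikit-learn's BernoulliRBM, the k-means rule centers, the shared width sigma, and the least-squares consequent fit are assumptions made for illustration, not details taken from the paper.

```python
# Minimal sketch (not the authors' exact algorithm): RBM pretraining followed by
# a zero-order Takagi-Sugeno-Kang (TSK) fuzzy classifier on the RBM features.
import numpy as np
from sklearn.neural_network import BernoulliRBM   # contrastive-divergence RBM
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy binary-classification data in [0, 1] (BernoulliRBM expects values in [0, 1]).
X = rng.random((200, 8))
y = (X[:, 0] + X[:, 1] > 1.0).astype(float)

# 1) Unsupervised pretraining: RBM hidden activations become the new features.
rbm = BernoulliRBM(n_components=6, learning_rate=0.05, n_iter=30, random_state=0)
H = rbm.fit_transform(X)                           # shape (n_samples, 6)

# 2) TSK antecedents: Gaussian membership functions around k-means rule centers
#    (an assumed way to place the rules, not taken from the paper).
n_rules = 5
centers = KMeans(n_clusters=n_rules, n_init=10, random_state=0).fit(H).cluster_centers_
sigma = 0.5                                        # shared width, an assumption

def firing_strengths(features):
    # Product of per-dimension Gaussian memberships = one Gaussian per rule.
    d2 = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)        # normalized firing strengths

# 3) Zero-order consequents fitted by linear least squares on the class label.
W = firing_strengths(H)
theta, *_ = np.linalg.lstsq(W, y, rcond=None)

# Prediction: weighted sum of rule consequents, thresholded at 0.5.
y_hat = (firing_strengths(H) @ theta > 0.5).astype(float)
print("training accuracy:", (y_hat == y).mean())
```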
References
G. E. Hinton and R. R. Salakhutdinov, “Reducing the dimensionality of data with neural networks,” Science, vol. 313, pp. 504–507, 28 July 2006. www.sciencemag.org
G. E. Hinton, S. Osindero, and Y. Teh, “A fast learning algorithm for deep belief nets,” Neural Computation, vol. 18, pp. 1527–1554, 2006.
A. Graves, Supervised Sequence Labelling with Recurrent Neural Networks. Berlin: Springer, 2012.
F. A. Gers, J. Schmidhuber, and F. Cummins, “Learning to Forget: Continual Prediction with LSTM,” 1999.
S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Computation, vol. 9, no. 8, pp. 1735–1780, 1997.
M. Sugeno and G. T. Kang, “Structure identification of Fuzzy Model,” Fuzzy Sets and Systems, vol. 28, Issue 1, October 1988, pp. 15–33.
I. Goodfellow et al., “Generative Adversarial Networks,” 2014.
J. Gehring et al., “A Convolutional Encoder Model for Neural Machine Translation,” 2016.
C. Nogueira dos Santos and M. Gatti, “Deep Convolutional Neural Networks for Sentiment Analysis of Short Texts,” 2014.
Y. Bengio, P. Lamblin, D. Popovici, and H. Larochelle, "Greedy layer-wise training of deep networks," in Advances in Neural Information Processing Systems 19 (NIPS'06), (B. Schölkopf, J. Platt, and T. Hoffman, eds.), pp. 153–160, MIT Press, 2007.
Wang Shitong and Korris Fu-Lai Chung, “On Least Learning Machine,” Journal of Jiangnan University (Natural Science Edition), vol. 9, pp. 505–510, Feb. 2010.
Ta Zhou, Fu-lai Chung, and Shitong Wang, “Deep TSK Fuzzy Classifier with Stacked Generalization and Triplely Concise Interpretability Guarantee for Large Data,” IEEE Transactions on Fuzzy Systems, vol. 23, no. 4, pp. 813–826, August 2016.
D. Rutkowska, Neuro-Fuzzy Architectures and Hybrid Learning. New York: Physica-Verlag, 2002.
A. Hoblitzell, M. Babbar-Sebens, and S. Mukhopadhyay, “Fuzzy and deep learning approaches for user modeling in wetland design,” in 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2016.
G. E. Hinton, “A Practical Guide to Training Restricted Boltzmann Machines,” http://learning.cs.toronto.edu, Aug. 2010.
R. Kruse, C. Borgelt, F. Klawonn, C. Moewes, M. Steinbrecher, and P. Held, Computational Intelligence. A Methodological Introduction. Berlin: Springer, 2013.
Zhaohong Deng, Longbing Cao, Yizhang Jiang and Shitong Wang, “Minimax Probability TSK Fuzzy System Classifier: A More Transparent and Highly Interpretable Classification Model,” IEEE Transactions on Fuzzy Systems, vol. 23, no. 4, pp. 813–826, Apr. 2015.
S. Haykin, Adaptive Filter Theory. Prentice Hall, 2002.
J.-S. R. Jang, “ANFIS: adaptive-network-based fuzzy inference system,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, no. 3, pp. 665–685, 1993.