Radial Basis Function Networks 2: New Advances in Design (Studies in Fuzziness and Soft Computing), by Robert J. Howlett
Performance measurement of the proposed method was conducted, and the experimental results indicate that it achieves better error-rate convergence and correct-classification accuracy than the results obtained with the continuous dataset.
A t-test statistical analysis was used to check the significance of the results, and most were found to be satisfactorily significant.
Introduction

Artificial Neural Network (ANN) was developed as a parallel distributed system inspired by the human brain's learning process. ANN's capacity to learn to solve problems through training has made it popular. The Radial Basis Function Network (RBFN) has only one hidden layer, unlike other types of ANN that may have one or more hidden layers. It is a feed-forward, fully connected network.
RBFN has several benefits over its predecessors. These include a simple network architecture, better approximation ability, and faster-learning training algorithms. RBFNs are used to solve problems such as classification, function approximation, system control, and time-series prediction. As a result, RBFN is widely used in science and engineering. This study therefore explores the performance of RBFN by determining its error-rate convergence (learning rate) and correct-classification accuracy.
The rest of the paper is organized into sections. Section 4 presents the Results and Discussion of the findings. The Conclusion of the study is given in Section 5, and finally the References consulted are listed in Section 6.
Proposed Method

In RBFN training, clustering algorithms are used to determine the centres and weights of the hidden layer, while Least Mean Squares (LMS) algorithms are employed to determine the network weights between the hidden and output layers. The weights between the input and hidden layers, although they could in principle be adjusted, are fixed at a value of 1. The hidden-layer nodes determine the behaviour and structure of the network.
A Gaussian function is used to activate the hidden layer. The numerous synaptic connections that link the billions of neurons in animals are modified to learn new things. Likewise, an ANN behaves in the same way, mimicking this process: it modifies its weights to enable it to learn new patterns. Therefore, ANN is a machine-learning process inspired by our brain.
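The two-stage pipeline just described (clustering to place the hidden-layer centres, Gaussian hidden activations, and a supervised solve for the hidden-to-output weights) can be sketched as follows. This is a minimal illustration under assumptions of our own: plain k-means for the clustering stage, a single fixed width `sigma` for every centre, and a batch least-squares solve standing in for iterative LMS — not the paper's exact algorithms.

```python
import numpy as np

def gaussian(x, centre, sigma):
    """Gaussian radial basis activation for one hidden node."""
    return np.exp(-np.sum((x - centre) ** 2) / (2.0 * sigma ** 2))

def kmeans_centres(X, k, iters=20, seed=0):
    """Unsupervised stage: plain k-means to place the hidden-layer centres."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each sample to its nearest centre
        dists = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return centres

def train_rbfn(X, y, k=4, sigma=1.0):
    """Supervised stage: solve for the hidden-to-output weights by
    least squares (the batch counterpart of an iterative LMS fit)."""
    centres = kmeans_centres(X, k)
    # design matrix of hidden activations, one column per centre
    H = np.array([[gaussian(x, c, sigma) for c in centres] for x in X])
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return centres, w

def predict(X, centres, w, sigma=1.0):
    H = np.array([[gaussian(x, c, sigma) for c in centres] for x in X])
    return H @ w
```

With `k` equal to the number of training points the least-squares solve interpolates the targets exactly; in practice `k` is chosen much smaller than the dataset.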
Every node of an artificial neural network has an activation function that is in charge of mapping the neuron's inputs to its output. The Backpropagation (BP) algorithm has been widely used as a training algorithm for ANN in supervised learning mode. One study reported that, at present, ANN has numerous real-life applications in machine learning and other fields.
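As a deliberately minimal illustration of a node's activation function mapping inputs to an output, here is a single artificial neuron. The logistic activation is our choice for illustration only — the RBFN discussed in this paper uses Gaussian activations in its hidden layer.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: the activation function maps the
    weighted sum of the inputs to an output in (0, 1)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) activation
```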
RBFN is trained using both unsupervised and supervised learning modes. The unsupervised mode is applied between the input and hidden layers, and the supervised mode is applied between the hidden and output layers. The functions computed by the hidden neurons provide a starting point for the input patterns. In the first stage of training, clustering algorithms find the cluster centres that best represent the distribution of the data.
Supervised learning is guided by the desired target values for the network. Training the network with a modified BP using a modified cost function, which other studies have shown to enhance ANN learning, has been implemented elsewhere; in our case, however, discretized datasets were used, which have also been reported to speed up learning. In RBFN, each layer of the network performs a different task, and that is one of the main problems with this network; separating the processes of the layers with different techniques is therefore a good idea. In this paper, five standard classification-problem datasets were used to test the proposed algorithm.
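Discretization can be performed in several ways; as one hypothetical example (this excerpt does not name the scheme the paper actually used), equal-width binning maps each continuous attribute to a small set of integer labels:

```python
import numpy as np

def discretize_equal_width(col, n_bins=4):
    """Map a continuous column to integer bin labels 0..n_bins-1
    using bins of equal width across the column's range."""
    edges = np.linspace(col.min(), col.max(), n_bins + 1)
    # digitize against the interior edges; clip keeps the maximum
    # value inside the top bin
    return np.clip(np.digitize(col, edges[1:-1]), 0, n_bins - 1)
```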
BP is one of the supervised learning algorithms that use the gradient-descent technique to reduce the cost function. Many factors affect the performance of the BP algorithm, such as the initial parameters, initial weights, learning rate, momentum term, network size, number of epochs, and the connections between units.
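The roles of the learning-rate and momentum terms mentioned above can be shown with a minimal gradient-descent update on a toy one-dimensional cost (not the network's actual cost function):

```python
def gd_momentum(grad, w0, lr=0.1, momentum=0.9, steps=100):
    """Gradient descent with a momentum term:
    v <- momentum * v - lr * grad(w);  w <- w + v."""
    w, v = w0, 0.0
    for _ in range(steps):
        v = momentum * v - lr * grad(w)
        w = w + v
    return w

# Toy cost C(w) = (w - 3)^2, gradient 2 * (w - 3); minimum at w = 3.
```

Too large a learning rate makes the iteration diverge, while momentum damps oscillation along the descent path — two of the tuning difficulties the text attributes to BP.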
The learning performance of BP depends on the network parameters, and a good choice of these parameters may greatly improve performance. BP generally takes a long time to train.