Neural Smithing: Supervised Learning In Feedforward Artificial Neural Networks Download


In this paper, the optimal number of neurons in one-hidden-layer artificial neural networks is investigated. Theoretical and statistical studies are carried out toward this goal. Finding the global minimum is necessary in order to determine the optimal number of neurons. However, since the training of artificial neural networks is a non-convex problem, it is difficult to reach a global minimum with optimization algorithms. In this study, an augmented cost function is proposed to find the global minimum, and hence the optimal number of neurons. It is shown that the optimal number of neurons is given by the artificial neural network model that reaches the global minimum with the aid of the augmented cost function. Additionally, the XOR and circle datasets are used to test the augmented cost function, achieving 99% success on the XOR dataset and 97% on the circle dataset. The optimal number of neurons is determined for these datasets.
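
The abstract does not spell out the form of the augmented cost function, so the sketch below is only an illustration of the overall procedure: train a one-hidden-layer network on the XOR dataset while sweeping the number of hidden neurons, using ordinary cross-entropy plus a hypothetical L2 penalty (lambda_reg) as a stand-in for the paper's augmented term. The smallest hidden-layer size that reaches full accuracy approximates the "optimal number of neurons" the paper searches for. All names and hyperparameters here are assumptions, not the authors' method.

```python
# Minimal sketch: sweep hidden-layer sizes for a one-hidden-layer network on XOR.
# The paper's augmented cost function is not given in the abstract; a plain
# cross-entropy loss plus a hypothetical L2 penalty (lambda_reg) stands in for it.
import numpy as np

rng = np.random.default_rng(0)

# XOR dataset: inputs and target labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(n_hidden, epochs=5000, lr=0.5, lambda_reg=1e-4):
    """Train a one-hidden-layer sigmoid network with full-batch gradient descent."""
    W1 = rng.normal(scale=1.0, size=(2, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=1.0, size=(n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass; gradient of cross-entropy w.r.t. the output pre-activation is (out - y),
        # and the L2 penalty adds lambda_reg * W to each weight gradient.
        d_out = out - y
        dW2 = h.T @ d_out + lambda_reg * W2
        db2 = d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * h * (1 - h)
        dW1 = X.T @ d_h + lambda_reg * W1
        db1 = d_h.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    # Evaluate with the final weights
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    accuracy = float(((out > 0.5).astype(float) == y).mean())
    loss = float(-(y * np.log(out + 1e-12) + (1 - y) * np.log(1 - out + 1e-12)).sum())
    return accuracy, loss

# Sweep candidate hidden-layer sizes and report accuracy and final loss for each.
for n in range(1, 6):
    acc, loss = train_xor(n)
    print(f"hidden neurons = {n}: accuracy = {acc:.2f}, final loss = {loss:.4f}")
```

In this toy setting, a single hidden neuron cannot represent XOR, so the sweep typically shows accuracy jumping to 1.00 once at least two hidden neurons are available, which mirrors the idea of reading off the optimal neuron count from the model that attains the lowest cost.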