e-space
Manchester Metropolitan University's Research Repository

On global-local artificial neural networks for function approximation

Wedge, David C., Ingram, David M., McLean, David A., Mingham, Clive G. and Bandar, Zuhair A. (2006) On global-local artificial neural networks for function approximation. IEEE Transactions on Neural Networks, 17 (4). pp. 942-952. ISSN 1045-9227


Abstract

We present a hybrid radial basis function (RBF) sigmoid neural network with a three-step training algorithm that utilizes both global search and gradient descent training. The algorithm is intended to identify global features of an input-output relationship before adding local detail to the approximating function. It aims to achieve efficient function approximation by separately identifying aspects of a relationship that are expressed universally from those that vary only within particular regions of the input space. We test the effectiveness of our method on five regression tasks; four use synthetic datasets, while the last uses real-world data on the wave overtopping of seawalls. The hybrid architecture is shown to be superior to architectures containing neurons of a single type in several ways: lower mean square errors are often achievable using fewer hidden neurons and with less need for regularization. Our global-local artificial neural network (GL-ANN) also compares favorably with both perceptron radial basis networks and regression-tree-derived RBF networks. A number of issues concerning the training of GL-ANNs are discussed: the use of regularization, the inclusion of a gradient descent optimization step, the choice of RBF spreads, model selection, and the development of appropriate stopping criteria.
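To illustrate the idea of a global-local hidden layer, the following is a minimal sketch (not the authors' implementation): sigmoid units model global trends across the input space while Gaussian RBF units add local detail, and both feed a common linear output. All names, shapes, and the least-squares fitting of the output weights are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gl_ann_forward(x, W_sig, b_sig, centres, spread, w_out, b_out):
    """Forward pass through a hybrid sigmoid + RBF hidden layer.

    x       : (n_samples, n_inputs) input matrix
    W_sig   : (n_inputs, n_sig) sigmoid input weights (global features)
    b_sig   : (n_sig,) sigmoid biases
    centres : (n_rbf, n_inputs) Gaussian RBF centres (local features)
    spread  : scalar RBF width
    w_out   : (n_sig + n_rbf,) linear output weights
    b_out   : scalar output bias
    """
    # Global part: sigmoid units respond across the whole input space.
    h_sig = sigmoid(x @ W_sig + b_sig)
    # Local part: Gaussian RBF units respond only near their centres.
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    h_rbf = np.exp(-d2 / (2.0 * spread ** 2))
    # A single linear output combines both feature types.
    h = np.concatenate([h_sig, h_rbf], axis=1)
    return h @ w_out + b_out
```

In a staged training scheme of the kind the abstract describes, one might fix the hidden units found by global search and then solve for `w_out` by linear least squares before any gradient descent refinement; the spread parameter controls how localized the RBF corrections are.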
