Speeding Up Backpropagation Using Multiobjective Evolutionary Algorithms


Abstract

The use of backpropagation for training artificial neural networks (ANNs) is usually associated with a long training process: the user needs to experiment with a number of network architectures, and larger networks demand more training time. The objective of this letter is to present an optimization algorithm that combines a multiobjective evolutionary algorithm with a gradient-based local search; in the rest of the letter, it is referred to as the memetic Pareto artificial neural network (MPANN) algorithm for training ANNs. The evolutionary approach trains the network and simultaneously optimizes its architecture, producing a set of networks in which each network represents a different trade-off between the training error and the size of the architecture. We also present a self-adaptive version with lower computational cost. We show empirically that the proposed method reduces training time compared to gradient-based techniques.
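To make the idea concrete, the sketch below is a minimal, illustrative rendering of this kind of memetic Pareto scheme, not the algorithm from the letter itself. Every name and hyperparameter in it (the XOR task, the mutation operators, population size, learning rate) is an assumption chosen for brevity. It evolves small one-hidden-layer networks whose hidden-layer size may grow or shrink, applies a few backpropagation steps to each candidate as the gradient-based local search, and retains the Pareto-nondominated set under the two objectives: training error and number of hidden units.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data set (XOR), chosen only so the sketch is self-contained.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def init_net(hidden):
    """One-hidden-layer sigmoid network with `hidden` units."""
    return {"W1": rng.normal(0, 1, (2, hidden)),
            "W2": rng.normal(0, 1, (hidden, 1))}

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(net, X):
    h = sigmoid(X @ net["W1"])
    return h, sigmoid(h @ net["W2"])

def mse(net):
    _, out = forward(net, X)
    return float(np.mean((out - y) ** 2))

def backprop_steps(net, steps=20, lr=0.5):
    """Gradient-based local search: a few plain backprop updates."""
    for _ in range(steps):
        h, out = forward(net, X)
        d_out = (out - y) * out * (1 - out)        # error at output layer
        d_h = (d_out @ net["W2"].T) * h * (1 - h)  # error at hidden layer
        net["W2"] -= lr * h.T @ d_out / len(X)
        net["W1"] -= lr * X.T @ d_h / len(X)
    return net

def mutate(net):
    """Weight perturbation plus structural mutation (add/prune a unit)."""
    child = {k: v + rng.normal(0, 0.1, v.shape) for k, v in net.items()}
    hidden = child["W1"].shape[1]
    r = rng.random()
    if r < 0.2 and hidden > 1:                     # prune a hidden unit
        k = rng.integers(hidden)
        child["W1"] = np.delete(child["W1"], k, axis=1)
        child["W2"] = np.delete(child["W2"], k, axis=0)
    elif r > 0.8:                                  # grow a hidden unit
        child["W1"] = np.hstack([child["W1"], rng.normal(0, 1, (2, 1))])
        child["W2"] = np.vstack([child["W2"], rng.normal(0, 1, (1, 1))])
    return child

def dominates(a, b):
    """Pareto dominance on (training error, hidden-unit count)."""
    return all(x <= z for x, z in zip(a, b)) and a != b

def pareto_front(pop):
    objs = [(mse(n), n["W1"].shape[1]) for n in pop]
    return [n for n, o in zip(pop, objs)
            if not any(dominates(p, o) for p in objs)]

# Evolve: each generation mutates parents, refines children with a short
# burst of backprop, and keeps only the nondominated networks.
pop = [backprop_steps(init_net(h)) for h in (2, 3, 4, 5)]
for gen in range(30):
    children = [backprop_steps(mutate(pop[rng.integers(len(pop))]))
                for _ in range(8)]
    pop = pareto_front(pop + children)

for net in pop:
    print(f"hidden={net['W1'].shape[1]}  mse={mse(net):.4f}")
```

The final population approximates a Pareto front rather than a single network: running the sketch prints one row per surviving network, so smaller architectures with higher error coexist with larger, more accurate ones, mirroring the set of trade-off networks described above.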