Using a Gradient Based Method to Seed an EMO Algorithm


Abstract

In the field of single-objective optimization, hybrids of gradient-based methods and evolutionary algorithms have been shown to perform better than purely evolutionary methods. The same idea has been applied to Evolutionary Multiobjective Optimization (EMO), also with very promising results. In most cases, gradient information is used as part of the mutation operator, in order to move every generated point toward the exact Pareto front. This means that gradient information is exploited throughout the whole process, and therefore consumes computational resources throughout the whole process. In contrast, our approach uses gradient information only at the beginning of the process, and we show that the quality of the results is not degraded while the computational cost is reduced. We use a steepest descent method to generate a few efficient points with which to seed an EMO method. The main goal is to generate some efficient points on the exact front using as few function evaluations as possible, and to let the EMO method use these points to spread along the whole Pareto front. In our approach we solve box-constrained continuous problems, gradients are approximated using quadratic regressions, and the EMO method is based on Rough Sets theory (Hernandez-Diaz et al., Parallel Problem Solving from Nature - PPSN IX, 9th International Conference, 2006).
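The following is a minimal sketch, not the authors' implementation, of the seeding idea described above: the gradient of each objective is approximated by a local quadratic regression, a few box-constrained steepest descent steps produce (locally) efficient points, and these points are used to seed the initial population of an EMO method. The objective functions, bounds, step sizes, and sample counts are illustrative assumptions.

```python
import numpy as np

def f1(x):  # hypothetical objective 1
    return np.sum((x - 1.0) ** 2)

def f2(x):  # hypothetical objective 2
    return np.sum((x + 1.0) ** 2)

def regression_gradient(f, x, radius=0.1, n_samples=20, rng=None):
    """Estimate grad f(x) from a quadratic regression on nearby samples."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    pts = x + rng.uniform(-radius, radius, size=(n_samples, d))
    diffs = pts - x
    # Design matrix [1, (p - x), (p - x)^2]: the linear-term coefficients
    # of the fitted quadratic approximate the gradient at x.
    A = np.hstack([np.ones((n_samples, 1)), diffs, diffs ** 2])
    y = np.array([f(p) for p in pts])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1:1 + d]

def steepest_descent(f, x0, bounds, step=0.05, n_iters=50):
    """Box-constrained steepest descent using the regression-based gradient."""
    x = x0.copy()
    for _ in range(n_iters):
        g = regression_gradient(f, x)
        x = np.clip(x - step * g, bounds[0], bounds[1])
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bounds = (-5.0 * np.ones(2), 5.0 * np.ones(2))
    # One efficient point per objective: the individual minima lie on the Pareto front
    seeds = [steepest_descent(f, rng.uniform(*bounds), bounds) for f in (f1, f2)]
    # Fill the rest of the initial EMO population with random points
    population = seeds + [rng.uniform(*bounds) for _ in range(8)]
```

The EMO method itself (in the paper, a Rough Sets based algorithm) would then evolve this seeded population to spread along the rest of the Pareto front; any population-based multiobjective optimizer could be substituted in this sketch.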