The hybridization of multi-objective evolutionary algorithms (MOEAs) with mathematical programming techniques has gained increasing popularity in the specialized literature in recent years. However, such hybrids normally rely on the use of gradients and, therefore, tend to consume a large number of extra objective function evaluations in order to estimate the required gradient information. The use of direct (nonlinear) search techniques has been less common in the specialized literature, although several hybrids of this sort have been proposed for single-objective evolutionary algorithms. This paper proposes a hybridization of a well-known MOEA (the NSGA-II) with two direct search methods (Nelder and Mead's method and the golden section algorithm). The aim of the proposed approach is to combine the global search mechanisms of the evolutionary algorithm with the local search mechanisms provided by the aforementioned mathematical programming techniques, such that a more efficient approach (i.e., one requiring a lower number of objective function evaluations) can be produced.
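
To illustrate the kind of gradient-free local search involved, the following is a minimal sketch of the standard golden section line search in Python. It is not the authors' implementation within NSGA-II; the function name, interface, and tolerance are illustrative assumptions, and how such a line search is coupled to the evolutionary search is described in the body of the paper.

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal 1-D function f on [a, b] using golden section search.

    Requires only function evaluations (no gradients), which is the property
    that motivates its use as a local search operator in the hybrid approach.
    """
    inv_phi = (math.sqrt(5) - 1) / 2          # 1/phi, approximately 0.618
    c = b - inv_phi * (b - a)                 # interior point closer to a
    d = a + inv_phi * (b - a)                 # interior point closer to b
    fc, fd = f(c), f(d)
    while abs(b - a) > tol:
        if fc < fd:                           # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                                 # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2

# Example usage: minimize a simple quadratic on [0, 5]
if __name__ == "__main__":
    x_star = golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
    print(f"approximate minimizer: {x_star:.6f}")   # close to 2.0
```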