Properties of Scalable Distance Minimization Problems Using the Manhattan Metric


In multi-objective optimization, scalable test problems are required to test and compare the search abilities of algorithms on both low- and high-dimensional problems. In this paper, we analyze a generalized Distance Minimization Problem (DMP) that is scalable in the number of decision variables and objectives and can be used with any distance function. Since previous research mostly considered the behaviour of algorithms for Euclidean distances, in this work we propose to use the Manhattan metric to measure the distances of solutions to a set of predefined locations in the decision space. The structure of the Pareto-fronts of this problem differs widely from that of the Euclidean problem. We perform an exemplary analytical analysis of low-dimensional instances of the problem to provide an understanding of its general properties and structure, and of the challenges that might arise in many-objective, many-variable instances. The negative effects on the search behaviour of algorithms are described theoretically, and three different optimization methods (MOEA/D, NSGA-II, SMPSO) are tested to give an understanding of different instances of the problem. The experimental results support our expectations and show that the proposed Manhattan-metric DMP is difficult for optimization algorithms to solve even in low-dimensional spaces.
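To make the problem concrete, the following is a minimal sketch of a DMP objective function under the Manhattan metric, as described above: each objective is the L1 distance from a candidate solution to one of a set of predefined target locations in the decision space. The function name `dmp_objectives` and the example target points are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dmp_objectives(x, targets):
    """Objective vector of a distance minimization problem (DMP):
    the i-th objective is the Manhattan (L1) distance from the
    candidate solution x to the i-th predefined target point.
    All objectives are to be minimized simultaneously."""
    x = np.asarray(x, dtype=float)
    targets = np.asarray(targets, dtype=float)
    # Broadcasting: one row of absolute differences per target point,
    # summed along the decision-variable axis to give L1 distances.
    return np.abs(targets - x).sum(axis=1)

# Example: three target points in a 2-D decision space give a
# 3-objective problem instance; here each distance equals 1.0.
targets = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(dmp_objectives((0.5, 0.5), targets))
```

The sketch scales in both dimensions the abstract mentions: adding target points increases the number of objectives, and lengthening the point coordinates increases the number of decision variables. Swapping the L1 sum for a Euclidean norm would recover the distance function used in the prior work referenced above.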