KURSAWE
Proposed by Kursawe [4]; $P_{true}$ disconnected, $PF_{true}$ disconnected.

Minimize $F = (f_1(\vec{x}), f_2(\vec{x}))$, where

$f_1(\vec{x}) = \sum\limits_{i=1}^{2} -10\,e^{-0.2\sqrt{x_i^2 + x_{i+1}^2}},$
$f_2(\vec{x}) = \sum\limits_{i=1}^{2} \left( \left\vert x_i \right\vert^{0.8} + 5\sin(x_i^3) \right),$

with $-5 \leq x_i \leq 5$, $i = 1, 2, 3$.

[Figures: Kursawe decision variable space and objective function space]
Kursawe Pareto Front (data file)
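
A minimal Python sketch of the Kursawe objectives, assuming NumPy and following the summation limits exactly as printed above; the function name kursawe is only illustrative:

    import numpy as np

    def kursawe(x):
        # x: three decision variables, each in [-5, 5]
        x = np.asarray(x, dtype=float)
        f1 = sum(-10.0 * np.exp(-0.2 * np.sqrt(x[i]**2 + x[i + 1]**2)) for i in range(2))
        f2 = sum(abs(x[i])**0.8 + 5.0 * np.sin(x[i]**3) for i in range(2))
        return f1, f2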

DEB Bimodal
Proposed by Deb [1]; $P_{true}$ connected, $PF_{true}$ connected.

Minimize $F = (f_1(\vec{x}), f_2(\vec{x}))$, where

$f_1(\vec{x}) = x_1,$
$f_2(\vec{x}) = \frac{g(\vec{x})}{x_1},$
$g(\vec{x}) = 2.0 - e^{-\left(\frac{x_2 - 0.2}{0.004}\right)^2} - 0.8\,e^{-\left(\frac{x_2 - 0.6}{0.4}\right)^2},$

with $0.1 \leq x_i \leq 1$, $i = 1, 2$.

[Figure: Deb bimodal objective function space]
Deb Pareto Front (data file)
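
A minimal Python sketch of the bimodal Deb problem as defined above, assuming NumPy; the name deb_bimodal is only illustrative:

    import numpy as np

    def deb_bimodal(x):
        # x = (x1, x2), each component in [0.1, 1]
        x1, x2 = float(x[0]), float(x[1])
        g = 2.0 - np.exp(-((x2 - 0.2) / 0.004)**2) - 0.8 * np.exp(-((x2 - 0.6) / 0.4)**2)
        return x1, g / x1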

KITA
Proposed by Kita [3]; $P_{true}$ connected, $PF_{true}$ concave; the nondominated solutions lie on the boundary of the feasible region.

Maximize $F = (f_1(\vec{x}), f_2(\vec{x}))$, where

$f_1(\vec{x}) = -x_1^2 + x_2,$
$f_2(\vec{x}) = \frac{1}{2}x_1 + x_2 + 1,$

subject to:

$\frac{1}{6}x_1 + x_2 - \frac{13}{2} \leq 0,$
$\frac{1}{2}x_1 + x_2 - \frac{15}{2} \leq 0,$
$5x_1 + x_2 - 30 \leq 0,$
$0 \leq x_i \leq 7, \quad i = 1, 2.$

[Figures: Kita decision variable space and objective function space]
KITA Pareto Front (data file)
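
A minimal Python sketch of the Kita problem (both objectives maximized) and its constraints; the helper names kita and kita_feasible are only illustrative:

    def kita(x):
        # x = (x1, x2), each component in [0, 7]; both objectives are to be maximized
        x1, x2 = float(x[0]), float(x[1])
        f1 = -x1**2 + x2
        f2 = 0.5 * x1 + x2 + 1.0
        return f1, f2

    def kita_feasible(x):
        # True when all three linear constraints listed above hold
        x1, x2 = float(x[0]), float(x[1])
        return (x1 / 6.0 + x2 - 13.0 / 2.0 <= 0.0 and
                x1 / 2.0 + x2 - 15.0 / 2.0 <= 0.0 and
                5.0 * x1 + x2 - 30.0 <= 0.0)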

DTLZ1
Proposed by Deb et al. [2].

Minimize $F = (f_1(\vec{x}), f_2(\vec{x}), f_3(\vec{x}))$, where

$f_1(\vec{x}) = \frac{1}{2}x_1 x_2 (1 + g(\vec{x})),$
$f_2(\vec{x}) = \frac{1}{2}x_1 (1 - x_2)(1 + g(\vec{x})),$
$f_3(\vec{x}) = \frac{1}{2}(1 - x_1)(1 + g(\vec{x})),$

and

$g(\vec{x}) = 100\left[ 10 + \sum\limits_{i=3}^{n}\left( (x_i - 0.5)^2 - \cos(20\pi(x_i - 0.5)) \right) \right],$

with $n = 12$, $0 \leq x_i \leq 1$, $i = 1, \ldots, 12$.

[Figures: DTLZ1 objective function space]
DTLZ1 Pareto Front (data file)
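
A minimal Python sketch of DTLZ1 for the three-objective, $n = 12$ case given above, assuming NumPy; the name dtlz1 is only illustrative (note that the constant 10 inside $g$ equals the number of distance variables $x_3, \ldots, x_{12}$):

    import numpy as np

    def dtlz1(x):
        # x: n = 12 decision variables in [0, 1]; x_3 ... x_12 are the distance variables
        x = np.asarray(x, dtype=float)
        xm = x[2:]
        g = 100.0 * (len(xm) + np.sum((xm - 0.5)**2 - np.cos(20.0 * np.pi * (xm - 0.5))))
        f1 = 0.5 * x[0] * x[1] * (1.0 + g)
        f2 = 0.5 * x[0] * (1.0 - x[1]) * (1.0 + g)
        f3 = 0.5 * (1.0 - x[0]) * (1.0 + g)
        return f1, f2, f3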

DTLZ7
Proposed by Deb et al. [2].

Minimize $F = (f_1(\vec{x}), f_2(\vec{x}), f_3(\vec{x}))$, where

$f_1(\vec{x}) = x_1,$
$f_2(\vec{x}) = x_2,$
$f_3(\vec{x}) = (1 + g(\vec{x})) \cdot h(f_1, f_2, g(\vec{x})),$

and

$g(\vec{x}) = 1 + \frac{9}{22} \sum\limits_{i=3}^{n} x_i,$
$h(f_1, f_2, g) = 3 - \sum\limits_{i=1}^{2} \left( \frac{f_i}{1 + g} \left(1 + \sin(3\pi f_i)\right) \right),$

with $n = 22$, $0 \leq x_i \leq 1$, $i = 1, \ldots, 22$.

[Figures: DTLZ7 objective function space]
DTLZ7 Pareto Front (data file)
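
A minimal Python sketch of DTLZ7 for the three-objective, $n = 22$ case, assuming NumPy and taking the constant $9/22$ in $g$ exactly as written above; the name dtlz7 is only illustrative:

    import numpy as np

    def dtlz7(x):
        # x: n = 22 decision variables in [0, 1]
        x = np.asarray(x, dtype=float)
        f1, f2 = x[0], x[1]
        g = 1.0 + (9.0 / 22.0) * np.sum(x[2:])
        h = 3.0 - sum(f / (1.0 + g) * (1.0 + np.sin(3.0 * np.pi * f)) for f in (f1, f2))
        f3 = (1.0 + g) * h
        return f1, f2, f3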

References

1
Kalyanmoy Deb.
Multi-Objective Genetic Algorithms: Problem Difficulties and Construction of Test Problems.
Evolutionary Computation, 7(3):205-230, Fall 1999.
2
Kalyanmoy Deb, Lothar Thiele, Marco Laumanns, and Eckart Zitzler.
Scalable Test Problems for Evolutionary Multi-Objective Optimization.
Technical Report 112, Computer Engineering and Networks Laboratory (TIK), Swiss Federal Institute of Technology (ETH), Zurich, Switzerland, 2001.
3
Hajime Kita, Yasuyuki Yabumoto, Naoki Mori, and Yoshikazu Nishikawa.
Multi-Objective Optimization by Means of the Thermodynamical Genetic Algorithm.
In Hans-Michael Voigt, Werner Ebeling, Ingo Rechenberg, and Hans-Paul Schwefel, editors, Parallel Problem Solving from Nature--PPSN IV, Lecture Notes in Computer Science, pages 504-512, Berlin, Germany, September 1996. Springer-Verlag.
4
Frank Kursawe.
A variant of evolution strategies for vector optimization.
In H. P. Schwefel and R. Männer, editors, Parallel Problem Solving from Nature. 1st Workshop, PPSN I, volume 496 of Lecture Notes in Computer Science, pages 193-197, Berlin, Germany, October 1991. Springer-Verlag.