Abstract
The thesis deals with a new evolutionary paradigm based on the concept of Estimation of
Distribution Algorithms (EDAs), which use a probabilistic model of the promising solutions found so
far to generate new candidate solutions of the optimized problem.
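This estimate-and-sample loop can be illustrated by a minimal univariate EDA in the spirit of UMDA (the simplest member of the EDA family); the parameter values and the OneMax fitness below are illustrative only, not taken from the thesis:

```python
import random

def umda(fitness, n_bits=20, pop_size=100, n_select=50, generations=30):
    """Minimal univariate EDA: estimate a probabilistic model of the
    promising (selected) solutions, then sample new candidates from it."""
    probs = [0.5] * n_bits          # start from a uniform model
    best = None
    for _ in range(generations):
        population = [[1 if random.random() < p else 0 for p in probs]
                      for _ in range(pop_size)]
        population.sort(key=fitness, reverse=True)
        selected = population[:n_select]       # the promising solutions
        # Model estimation: marginal frequency of each bit among them.
        probs = [sum(ind[i] for ind in selected) / n_select
                 for i in range(n_bits)]
        if best is None or fitness(population[0]) > fitness(best):
            best = population[0][:]
    return best

random.seed(1)
solution = umda(sum)   # OneMax: maximize the number of ones
```

More advanced EDAs, such as the BOA family discussed in this thesis, replace the independent bit-wise marginals with a model that captures dependencies (linkage) between genes.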
There are six primary goals of this thesis:
1. Suggestion of a new formal description of EDA algorithms. This high-level concept can be used to
compare the generality of various probabilistic models by comparing the properties of the underlying
mappings. Also, some convergence issues are discussed and theoretical ways for further improvement
are proposed.
2. Development of a new probabilistic model and methods capable of dealing with continuous
parameters. The resulting Mixed Bayesian Optimization Algorithm (MBOA) uses a set of decision
trees to express the probability model. Its main advantage over the widely used IDEA and EGNA
approaches is its backward compatibility with discrete domains, which makes it uniquely capable of
learning linkage between mixed continuous-discrete genes. MBOA handles the discretization of
continuous parameters as an integral part of the learning process, which outperforms the
histogram-based approach. The original metric for MBOA is derived as an incremental version of the
Bayesian-Dirichlet metric (BDe). Its usefulness for the modelling of Boolean functions is also
investigated and confirmed.
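A hypothetical sketch of how such a decision-tree model can be sampled (the exact thesis data structures may differ): each gene has a decision tree whose internal nodes test previously sampled parent genes, and whose leaves hold an elementary model, a Bernoulli probability for a discrete gene or a Gaussian for a continuous one. Thresholds on continuous parents play the role of the learned, adaptive discretization mentioned above. All names and numbers here are illustrative:

```python
import random

def sample_gene(tree, assignment):
    """Walk the decision tree using already-sampled genes, then draw
    from the elementary distribution stored in the reached leaf."""
    node = tree
    while node["kind"] == "split":
        branch = "yes" if node["test"](assignment) else "no"
        node = node[branch]
    if node["kind"] == "bernoulli":       # leaf for a discrete gene
        return 1 if random.random() < node["p"] else 0
    else:                                 # "gaussian" leaf, continuous gene
        return random.gauss(node["mean"], node["sigma"])

# Toy model: x0 is discrete; continuous x1 depends on x0 (mixed linkage).
tree_x0 = {"kind": "bernoulli", "p": 0.7}
tree_x1 = {"kind": "split",
           "test": lambda a: a["x0"] == 1,
           "yes": {"kind": "gaussian", "mean": 2.0, "sigma": 0.1},
           "no":  {"kind": "gaussian", "mean": -2.0, "sigma": 0.1}}

assignment = {}
assignment["x0"] = sample_gene(tree_x0, assignment)
assignment["x1"] = sample_gene(tree_x1, assignment)
```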
3. Parallelization of EDA algorithms. Different versions of parallel EDA algorithms are proposed for
fine-grained, coarse-grained and multithreaded environments. All these algorithms use the original
principle of restricted ordering of nodes in the Bayesian network to minimize the communication
between parallel processes:
- The pipelined Parallel Bayesian Optimization Algorithm (PBOA) is proposed and simulated
for the fine-grained type of parallelism. The target platform for its implementation can be
transputers or a single-purpose MIMD processor.
- The Distributed Bayesian Optimization Algorithm (DBOA) is proposed and implemented for
the coarse-grained type of parallelism. The experiments were run on a uniform cluster of Sun
Ultra 5 workstations connected by a fast Ethernet switch. Both PBOA and DBOA are based on
the parallelization of Pelikan's classical BOA code.
- The multithreaded MBOA algorithm, which uses the original MBOA code with the mixed
decision-tree model, was simulated in the Transim tool.
In all parallel algorithms, additional progress towards real-time performance can be achieved by
choosing the KBOA version of the BOA algorithm. The concept of KBOA is based on partial (local)
information about the linkage. This information is used to adjust the prior of the probabilistic model,
and the injection of building blocks is used to improve the quality of the initial population.
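The restricted-ordering principle behind these parallel variants can be sketched as follows (an illustrative reconstruction, not the thesis code): once a permutation of the genes is fixed and agreed on by all processes, each gene may take parents only from genes that precede it in the permutation. Acyclicity of the network is then guaranteed by construction, so each process can learn the model for its own genes independently, without cycle-checking communication:

```python
def allowed_parents(perm, gene):
    """Parent candidates of `gene` under the restricted ordering `perm`:
    only genes that precede it in the permutation are allowed, which
    makes the resulting Bayesian network acyclic by construction."""
    pos = perm.index(gene)
    return set(perm[:pos])

# Four genes distributed over two processes; perm is agreed on up front.
perm = [2, 0, 3, 1]
work = {"proc0": [2, 3], "proc1": [0, 1]}
structure = {p: {g: allowed_parents(perm, g) for g in genes}
             for p, genes in work.items()}
```

Each process can now search for the best parent set of its genes inside these candidate sets without consulting the other process.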
4. Extension of single-objective EDAs to multi-objective EDAs. The experiments show that the realized
Pareto-strength BOA algorithm, which utilizes a promising Pareto-strength niching technique,
outperforms the classical constraint- or weighting-based approaches and is comparable to the best
recombination-based multiobjective algorithms of its time, SPEA and NSGA. A completely new
optimization technique for the MBOA core was designed in collaboration with the author of SPEA2.
This new selection operator, based on epsilon-archives, leads to the implementation of the Bayesian
Multi-objective Optimization Algorithm (BMOA).
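A hedged sketch of the epsilon-archive idea underlying such a selection operator (the exact BMOA procedure may differ): with objectives to be minimized, each solution is mapped to a grid box of side epsilon, and the archive keeps at most one representative per non-dominated box. This bounds the archive size while guaranteeing an epsilon-approximate coverage of the Pareto front:

```python
import math

def box(point, eps):
    """Map an objective vector (minimization) to its epsilon-grid box."""
    return tuple(math.floor(f / eps) for f in point)

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and a != b

def archive_add(archive, point, eps):
    """Try to insert `point`; `archive` maps grid boxes to one point each."""
    b = box(point, eps)
    if any(dominates(ob, b) for ob in archive):
        return False                       # an archived box dominates it
    for ob in [o for o in archive if dominates(b, o)]:
        del archive[ob]                    # the new box dominates these
    if b in archive and not dominates(point, archive[b]):
        return False                       # same box, incumbent kept
    archive[b] = point
    return True

a = {}
archive_add(a, (0.2, 3.2), 1.0)
archive_add(a, (3.1, 0.4), 1.0)
archive_add(a, (0.1, 0.1), 1.0)   # dominates both archived boxes
```

Using the archive membership test as an acceptance criterion turns the archive into a selection operator: only solutions that improve the epsilon-approximation of the front are retained.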
5. Design of the modular development system DEBOA for rapid prototyping of optimization
applications on the basis of the BOA algorithm. The general requirements for the development system
are identified, including modularity and visualization of results. The development system itself is
realized on the platform of the Java language, which ensures great portability. The DEBOA system
can be easily extended to support new types of fitness functions as well as new types of visualizations.
6. Testing of the developed algorithms on well-known benchmarks as well as several real-world
problems, mainly from the area of circuit design.
Key Words: Bayesian network, Gaussian network, Estimation of Distribution Algorithm, Factorized
Distribution Algorithm, Bayesian Optimization Algorithm, Univariate Marginal Distribution
Algorithm, Bivariate Marginal Distribution Algorithm, Bayesian-Dirichlet metric, scoring metric,
dependency structure, machine learning, linkage learning, decision graph, kernel distribution, mixture
model, formal description, building block corruption, convergence, hypergraph bisectioning,
knapsack problem, additively decomposable function, multiobjective optimization, Pareto set, niching,
Pareto-strength fitness, epsilon dominance, pipelined processing, distributed processing,
multithreaded processing, scalability, rapid prototyping of BOA application.