In many fields of society, people are called upon to solve complex optimization problems. Optimization is the central theme of this thesis. More specifically, this thesis is about optimization by induction, where an algorithm makes educated guesses about what constitutes a good solution to the problem under study, building on solutions evaluated earlier in its execution.

In this thesis, we consider evolutionary algorithms (EAs) as a specific class of optimization algorithms. By implementing certain abstractions of processes observed in nature, EAs are designed to perform induction. Different designs lead to different EAs and consequently to different inductive and optimization behavior. We combine EAs with tools from probability theory and study the applicability and performance of the resulting probability-based EAs, called iterated density estimation evolutionary algorithms (IDEAs), on different types of optimization problems.

The main contribution of IDEAs is that, by estimating a probability distribution over selected solutions and subsequently drawing new solutions from this distribution, they provide an inductive tool for the online identification of features of the problem's structure. These features are then used to guide the search more efficiently towards promising regions of the search space. IDEAs are relatively new to the field of evolutionary computation. In this thesis, we motivate their use and investigate their optimization performance in the fields of numerical optimization, permutation optimization, and multi-objective optimization.
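To make the estimate-and-sample loop concrete, the following is a minimal sketch of an IDEA-style algorithm on binary strings, assuming the simplest possible probability model: independent per-bit marginal frequencies (a univariate model). The OneMax fitness function, population sizes, and selection rate are illustrative choices, not taken from the thesis.

```python
import random

def onemax(x):
    # Toy fitness: count of ones; the optimum is the all-ones string.
    return sum(x)

def idea_univariate(n_bits=20, pop_size=100, n_selected=30,
                    generations=50, seed=0):
    rng = random.Random(seed)
    # Start from the uniform distribution: each bit is 1 with probability 0.5.
    p = [0.5] * n_bits
    best = None
    for _ in range(generations):
        # Draw a new population from the current estimated distribution.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        # Truncation selection: keep the best solutions found so far.
        pop.sort(key=onemax, reverse=True)
        selected = pop[:n_selected]
        if best is None or onemax(pop[0]) > onemax(best):
            best = pop[0]
        # Estimate a new distribution over the selected solutions:
        # here, simply the per-bit frequency of ones.
        p = [sum(x[i] for x in selected) / n_selected for i in range(n_bits)]
    return best

best = idea_univariate()
print(onemax(best))
```

Richer probability models in this loop (e.g. models that capture dependencies between variables) are what allow IDEAs to identify and exploit problem structure beyond what this independent-bit sketch can represent.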