The Cross-Entropy Method: A Versatile Monte Carlo Technique

Encyclopedia of Operations Research and Management Science,
Third Edition, Springer-Verlag, 2012

Dirk P. Kroese, School of Mathematics and Physics, The University of Queensland, Brisbane 4072, Australia
Reuven Y. Rubinstein, Faculty of Industrial Engineering and Management, Technion, Haifa, Israel
Izack Cohen, Faculty of Industrial Engineering and Management, Technion, Haifa, Israel
Sergey Porotsky, ALD Ltd., Tel-Aviv 67137, Israel
Thomas Taimre, School of Mathematics and Physics, The University of Queensland, Brisbane 4072, Australia

The cross-entropy (CE) method is a powerful heuristic tool for solving difficult estimation and optimization problems, based on Kullback–Leibler (or cross-entropy) minimization. It is a versatile Monte Carlo technique introduced by Rubinstein (1999, 2001), extending earlier work on variance minimization (Rubinstein 1997).
The CE method can be applied to two types of problems:
1. Estimation: Estimate $\ell = \mathbb{E}[H(\mathbf{X})]$, where $\mathbf{X}$ is a random object taking values in some set $\mathcal{X}$ and $H$ is a function on $\mathcal{X}$. An important special case is the estimation of a probability $\ell = \mathbb{P}(S(\mathbf{X}) \geq \gamma)$, where $S$ is another function on $\mathcal{X}$ and $\gamma$ is a given level (this case is written in the general form just after this list).
2. Optimization: Optimize (that is, maximize or minimize) $S(\mathbf{x})$ over all $\mathbf{x} \in \mathcal{X}$, where $S$ is some objective function on $\mathcal{X}$.
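
The rare-event probability in the estimation setting fits the general form by taking $H$ to be an indicator function; a brief sketch in the notation above:

\[
  \ell = \mathbb{P}\bigl(S(\mathbf{X}) \geq \gamma\bigr)
       = \mathbb{E}\bigl[\mathbb{I}\{S(\mathbf{X}) \geq \gamma\}\bigr],
  \qquad \text{with } H(\mathbf{x}) = \mathbb{I}\{S(\mathbf{x}) \geq \gamma\}.
\]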
In the estimation setting, the CE method can be viewed as an adaptive importance sampling procedure that uses the cross-entropy or Kullback–Leibler divergence as a measure of closeness between two sampling distributions. In the optimization setting, the optimization problem is first translated into a rare-event estimation problem, and then the CE method for estimation is used as an adaptive algorithm to locate the optimum.
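
As a concrete illustration of the optimization setting, the following minimal Python sketch applies the CE idea to a toy continuous maximization problem: candidate solutions are drawn from a Gaussian sampling distribution, the best-scoring ("elite") samples define an adaptive level, and the Gaussian parameters are refit to those elite samples at each iteration. The function name cross_entropy_maximize, its parameters, and the toy objective are illustrative assumptions, not part of the article.

import numpy as np

def cross_entropy_maximize(score, dim, n_samples=100, elite_frac=0.1,
                           n_iters=50, init_std=10.0, seed=None):
    """Minimal CE sketch: maximize `score` over R^dim with a Gaussian sampling family."""
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)                 # mean of the sampling distribution
    sigma = np.full(dim, init_std)     # componentwise standard deviations
    n_elite = max(1, int(elite_frac * n_samples))
    for _ in range(n_iters):
        # Draw a population from the current sampling distribution N(mu, diag(sigma^2)).
        x = mu + sigma * rng.standard_normal((n_samples, dim))
        scores = np.array([score(xi) for xi in x])
        # Elite set: the top-scoring samples, which define the adaptive level.
        elite = x[np.argsort(scores)[-n_elite:]]
        # CE update for a Gaussian family: refit mean and std to the elite samples.
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-8
    return mu

# Toy usage: the maximizer of this concave quadratic is (2, -3).
best = cross_entropy_maximize(lambda x: -np.sum((x - np.array([2.0, -3.0]))**2),
                              dim=2, seed=0)

With settings like these the sampling distribution typically concentrates near the optimum within a few dozen iterations; the same scheme extends to discrete problems by replacing the Gaussian family with another parametric family, such as independent Bernoulli components.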