Improved cross-entropy importance sampling
Abstract
In probabilistic reliability analysis, the probability of failure (PF) can be estimated using various sampling methods. To enhance computational efficiency and reduce the variance of the probability of failure estimate, different importance sampling (IS) techniques have been developed. The cross-entropy method is an adaptive importance sampling method that approaches an optimal importance sampling density by fitting a parametric distribution to samples near or within the failure domain. Although this method has shown good performance compared to classical IS methods, it only uses a fraction of the evaluated samples, thereby discarding most of the limit state function evaluations.
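For context, all of the methods compared here ultimately feed into the basic importance sampling estimate of the PF. The following minimal Python sketch is not taken from the report; the symbols g (limit state function, with failure defined as g(x) <= 0), f (input density) and h (IS density) are our own notation.

```python
import numpy as np
from scipy import stats

def is_estimate_pf(g, f_density, h_density, h_sample, n=10_000, seed=0):
    """Basic importance sampling estimate of the probability of failure.

    g          : limit state function; failure is defined as g(x) <= 0
    f_density  : pdf of the original input distribution
    h_density  : pdf of the importance sampling density
    h_sample   : callable drawing n samples from the IS density
    """
    rng = np.random.default_rng(seed)
    x = h_sample(n, rng)                      # samples from the IS density
    w = f_density(x) / h_density(x)           # importance weights f/h
    indicator = (g(x) <= 0).astype(float)     # 1 inside the failure domain
    pf = np.mean(indicator * w)               # unbiased PF estimate
    cov = np.std(indicator * w) / (np.sqrt(n) * pf)  # coefficient of variation
    return pf, cov

# Illustrative example (our own, not from the report): rare event P(X >= 3.5)
# for a standard normal input, with a Gaussian IS density centred at 3.5.
g = lambda x: 3.5 - x
f = stats.norm(0, 1).pdf
h = stats.norm(3.5, 1.0)
pf, cov = is_estimate_pf(g, f, h.pdf,
                         lambda n, rng: h.rvs(size=n, random_state=rng))
print(pf, cov)   # pf close to 1 - Phi(3.5) ~ 2.3e-4
```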
The improved cross-entropy method makes use of a dynamic, smooth approximation of the indicator function to improve computational efficiency. In this report, the improved cross-entropy method is introduced and benchmarked for efficiency and performance against the standard cross-entropy method and other importance sampling methods. Two parametric families are compared as importance sampling densities: the Single-Gaussian and the von Mises-Fisher-Nakagami distribution, the latter being particularly well suited for high-dimensional problems. The improved cross-entropy method shows performance comparable to classical IS and the standard Single-Gaussian cross-entropy method in terms of the estimated mean and coefficient of variation of the PF. With the current implementation, the von Mises-Fisher-Nakagami distribution overall shows inferior performance compared to the improved cross-entropy method using Single-Gaussian densities.
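The abstract does not state the exact form of the smooth indicator approximation. A common choice in the improved cross-entropy literature is to replace the indicator I{g(x) <= 0} by the standard normal CDF Phi(-g(x)/sigma), with the smoothing parameter sigma decreased adaptively over the iterations so that every evaluated sample contributes to fitting the parametric IS density. The short sketch below illustrates that choice under this assumption; it is not the report's implementation.

```python
import numpy as np
from scipy.stats import norm

def smooth_indicator(g_values, sigma):
    """Smooth approximation Phi(-g/sigma) of the failure indicator I{g <= 0}.

    For sigma -> 0 this converges to the exact 0/1 indicator; a larger sigma
    lets samples outside the failure domain contribute to the weighted fit of
    the parametric IS density, so fewer limit state evaluations are wasted.
    """
    return norm.cdf(-np.asarray(g_values, dtype=float) / sigma)

# Samples close to (g = 0.8), on (g = 0.0) and inside (g = -0.5) the failure
# boundary all receive a nonzero weight instead of a hard 0/1 value.
print(smooth_indicator([0.8, 0.0, -0.5], sigma=1.0))  # approx. [0.21, 0.50, 0.69]
```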