🤖 AI Summary
In seismic fragility analysis of nuclear facilities, conventional frequentist methods—particularly maximum likelihood estimation (MLE) of probit-lognormal fragility curves—suffer from severe bias and likelihood degeneracy under extremely small binary-response samples, rendering them unreliable.
Method: This paper proposes a Bayesian sequential experimental design framework grounded in a constrained reference prior. It systematically identifies likelihood degeneracy as the fundamental limitation for small-sample fragility modeling and introduces, for the first time, a parameter-constrained reference prior that ensures both objectivity and numerical robustness, thereby preventing ill-conditioned posteriors. Fragility curves are estimated via sequential data acquisition and Bayesian updating.
Contribution/Results: Experiments demonstrate that with only 5–10 test samples, the proposed method reduces estimation error by over 60% compared to MLE, while significantly outperforming naïve Bayesian and conventional approaches in bias identification and quantification.
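The probit-lognormal model referred to above expresses the failure probability as P(failure | IM = a) = Φ(ln(a/α)/β), with median capacity α and log-standard deviation β, fitted to binary failure/non-failure outcomes by MLE. A minimal sketch of that baseline fit (the intensity values and outcomes below are illustrative, not from the paper) helps make the degeneracy problem concrete: when the binary outcomes are completely separated by intensity level, the likelihood is maximized as β → 0 and the MLE degenerates, which is exactly the small-sample failure mode the paper targets.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fragility(a, alpha, beta):
    """Probit-lognormal fragility curve: P(failure | IM = a)."""
    return norm.cdf(np.log(a / alpha) / beta)

def neg_log_likelihood(theta, a, z):
    """Negative log-likelihood of binary outcomes z at intensities a."""
    alpha, beta = np.exp(theta)  # log-parameterization keeps both positive
    p = np.clip(fragility(a, alpha, beta), 1e-12, 1 - 1e-12)
    return -np.sum(z * np.log(p) + (1 - z) * np.log(1 - p))

# Illustrative binary test data (intensity measures, e.g. PGA in g).
# The mixed outcomes around a = 1 keep the MLE well-defined here;
# with perfectly separated outcomes the optimizer would drive beta -> 0.
a = np.array([0.3, 0.6, 0.9, 1.2, 1.5, 1.8])
z = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])

res = minimize(neg_log_likelihood, x0=np.log([1.0, 0.5]),
               args=(a, z), method="Nelder-Mead")
alpha_hat, beta_hat = np.exp(res.x)
```

With only a handful of points, small perturbations of the data set can flip it between the well-posed and the separated regime, which is why the summary describes MLE as unreliable in this setting.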
📝 Abstract
Seismic fragility curves express the probability of failure of mechanical equipment conditional on an intensity measure derived from a seismic signal. Although it rests on a strong assumption, the probit-lognormal model is very popular among practitioners for estimating such curves, judging by its abundant use in the literature. However, as this model is likely to lead to biased estimates, its use should be limited to cases for which only a few data are available. In practice, this means resorting to binary data that indicate the state of the structure after it has been subjected to a seismic loading, namely failure or non-failure. The question then arises of which data should be used to obtain an optimal estimate, that is to say one as precise as possible with a minimum of data. To answer this question, we propose a methodology for the design of experiments in a Bayesian framework based on reference prior theory. This theory aims to define a so-called objective prior that favors data learning; in this work it is slightly constrained in order to tackle the problems of likelihood degeneracy that are ubiquitous with small data sets. The novelty of our work is then twofold. First, we rigorously present the problem of likelihood degeneracy, which hampers frequentist approaches such as maximum likelihood estimation. Second, we propose our strategy, inherited from reference prior theory, for building the data set. This strategy aims to maximize the impact of the data on the posterior distribution of the fragility curve. Our method is applied to a case study from the nuclear industry. The results demonstrate its ability to estimate the fragility curve efficiently and robustly, and to avoid degeneracy even with a limited number of experiments. Additionally, we show that the estimates quickly reach the model bias induced by the probit-lognormal modeling.
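The Bayesian side of the abstract can be sketched with a toy posterior computation. This is not the paper's reference prior (which is derived from information-theoretic arguments); instead, a flat prior over a bounded parameter grid stands in for the constrained prior, with the lower bound on β mimicking the constraint that rules out degenerate fits. All data and grid choices below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Illustrative binary test data: intensity measures and failure indicators.
a = np.array([0.3, 0.5, 0.8, 1.0, 1.4, 1.8])
z = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])

# Bounded grid over (alpha, beta); beta >= 0.05 plays the role of the
# constraint that keeps the posterior away from the degenerate beta -> 0 fit.
alphas = np.linspace(0.2, 3.0, 200)
betas = np.linspace(0.05, 1.5, 200)
A, B = np.meshgrid(alphas, betas, indexing="ij")

def log_lik(alpha, beta):
    """Bernoulli log-likelihood of the data on the whole parameter grid."""
    p = np.clip(
        norm.cdf(np.log(a[None, None, :] / alpha[..., None]) / beta[..., None]),
        1e-12, 1 - 1e-12)
    return np.sum(z * np.log(p) + (1 - z) * np.log(1 - p), axis=-1)

log_post = log_lik(A, B)              # flat prior on the bounded grid
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Posterior-mean failure probability at a new intensity level.
a_new = 1.0
p_new = norm.cdf(np.log(a_new / A) / B)
posterior_mean = np.sum(post * p_new)
```

In the paper's sequential design, each new test point would be chosen to maximize its expected impact on this posterior before the next Bayesian update; the grid posterior above is only the updating step of that loop.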