
Analytic Solution of Seismic Probabilistic Risk Assessment

O. Nusbaumer
Leibstadt Nuclear Power Plant (KKL), Leibstadt, Switzerland
Swiss Federal Institute of Technology (ETH), Zurich, Switzerland

ABSTRACT: Probabilistic seismic risk assessments (PSRA) are used to quantify the damage probability of complex engineering structures due to seismic events. In such systems, the seismic capacities of the individual relevant components are determined by structural mechanics. A significant digits computation method to quantify the mean seismic failure probability (fragility) for a given component capacity and seismic intensity is proposed. The overall system failure probability is then calculated using standard fault tree / event tree models. Most of the tools available to quantify such logical models implement established techniques based on the rare event approximation. However, for high failure probabilities as seen in PSRAs, the rare event approximation produces conservative results. A method based on binary decision diagrams (BDD) is proposed to quantify such seismic models analytically. A simplified, representative binary decision diagram model is developed in order to evaluate the impact of the rare event approximation on such models. This paper focuses on PSRAs in the nuclear industry.
1 INTRODUCTION

In this paper, the peak ground acceleration is chosen as the ground motion level indicator.

1.1 Scope of a probabilistic seismic risk assessment


The scope of a probabilistic seismic risk assessment
(PSRA) is to address potential scenarios that could
be initiated by a seismic event. These scenarios
model the system response and they are developed
until either a safe condition is reached or until the
accident progression has reached a state where damage is imminent.
A PSRA addresses the range of postulated seismic events for different earthquake intensities, covering seismically induced failures as well as random failures and human errors. It provides an
estimate of the frequency of accident sequences, and
it identifies and ranks the major contributors to the
top event probability.
1.2 Seismic hazard approach
The seismic hazard curve represents the site-specific seismicity and is developed by seismologists. It integrates the contribution of all possible earthquakes for a specific site and is represented by the annual probability of exceeding a given estimator of the ground motion level (Figure 1). Such estimators include the peak ground acceleration (PGA), the average spectral acceleration, or the pseudo-relative velocity.

Figure 1. Example of seismic hazard curve.

1.3 Component seismic capacity


In complex engineering systems as found in the nuclear industry, individual component seismic capacities are developed.
The development process involves identifying seismically relevant components and obtaining detailed component information. A component capacity is given by a triple (a_m, β_u, β_r), a_m being the median ground motion capacity, β_u the uncertainty in capacity, and β_r the randomness in the impact of the earthquake.

Capacity factors are derived from several sources of information including plant-specific design reports, test reports, generic earthquake experience data, and generic analytical derivations of capacity based on governing codes and standards. Both structural and functional failure modes are considered in developing capacity factors for equipment.

1.4 Component seismic fragility

Fragility is defined as the conditional probability of failure for a given ground motion level. Fragilities are developed on a component-specific basis or for a group of similar components, considering component location. A typical fragility curve and its associated uncertainty range is shown in Figure 2.

Figure 2. Example of seismic fragility curve for a given component capacity (a_m = 0.92), including 5th percentile; median (50th percentile); 95th percentile; mean.

The fragility curves are expressed in terms of probability of failure as a function of the sustained ground motion level. The variability of the fragility curves is represented by two parameters. These parameters take into account the inherent randomness of the capacity of a particular type of component (β_r) and the uncertainty in the median capacity (β_u). Such a formulation is important in order to separate the effects of randomness and uncertainty in the seismic risk.

1.5 Correlation between failure modes

Many of the potential failure modes of safety-related equipment are not considered to be completely independent. For instance, the collapse of a structure is also expected to result in failure of the equipment located in that structure. Some degree of response correlation exists for all items and for all modes of failure, since they are all excited by the same seismic event.

1.6 Accident sequence modeling

The event tree delineates the possible accident scenarios for the seismic event. The event tree functional logic is developed to address seismic-specific aspects of the analysis. The event trees are coupled with the system models through fault tree linking to develop a comprehensive, integrated model.

1.7 Accident sequence quantification

The possible combinations of randomly and seismically induced failures are modeled using logical fault tree and event tree structures. Accident sequences are thus expressed in terms of Boolean equations or minimal cut sets, or prime implicants.
Seismic risk quantification is performed for several discrete ground motion levels. Those discrete seismic risks are then summed up to obtain the overall seismic risk.

2 SIGNIFICANT DIGIT COMPUTATION OF THE FRAGILITIES

2.1 Governing equation

The fragility F(a, Q) is represented by the following equation for a given component capacity triple (a_m, β_u, β_r) (American Nuclear Society and Institute of Electrical and Electronics Engineers 1983):

F(a, Q) = \Phi\left( \frac{\ln(a / a_m) + \beta_u \Phi^{-1}(Q)}{\beta_r} \right)    (1)

where \Phi(x) = standard Gaussian cumulative function; Q = confidence level (0..1); a_m = median ground acceleration capacity; β_u = uncertainty in capacity; β_r = randomness in earthquake and effects; a = sustained ground motion level.
The mean failure probability and its associated uncertainty distribution cannot be directly calculated. Effective calculation of the mean value over the whole range of postulated seismic ground motion levels is discussed in the next section.

2.2 Analytic solution of the seismic failure probability distribution

2.2.1 Mean value
For a more efficient calculation of F, the Gaussian cumulative function can be rewritten in the form:

\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-t^2/2} \, dt = \frac{1}{2} \left( 1 + \mathrm{Erf}\!\left( \frac{x}{\sqrt{2}} \right) \right)    (2)

where:

\mathrm{Erf}(x) := \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{-t^2} \, dt    (3)

Isolating x, one obtains:

x = \sqrt{2} \, \mathrm{InvErf}(2Q - 1)    (4)

where InvErf(x) = reciprocal function of Erf(x).
Combining equations (2) and (4) with (1) yields the following equation for a given component capacity triple (a_m, β_u, β_r):

F(a, Q) = \frac{1}{2} \left[ 1 + \mathrm{Erf}\!\left( \frac{\ln(a / a_m) + \sqrt{2} \, \beta_u \, \mathrm{InvErf}(2Q - 1)}{\sqrt{2} \, \beta_r} \right) \right]    (5)

Isolating Q, one obtains:

Q(a, F) = \frac{1}{2} \left[ 1 + \mathrm{Erf}\!\left( \frac{\sqrt{2} \, \beta_r \, \mathrm{InvErf}(2F - 1) - \ln(a / a_m)}{\sqrt{2} \, \beta_u} \right) \right]    (6)

The fragility distribution (probability density) for a given ground motion level has, per definition, the following property:

d(a, F) = \frac{dQ(a, F)}{dF} = \frac{\beta_r}{\beta_u} \exp\!\left( \left[ \mathrm{InvErf}(2F - 1) \right]^2 - \left[ \frac{\sqrt{2} \, \beta_r \, \mathrm{InvErf}(2F - 1) - \ln(a / a_m)}{\sqrt{2} \, \beta_u} \right]^2 \right)    (7)

where d(a, F) = fragility distribution (probability density) for a given earthquake ground motion level a.
A three-dimensional representation of the distribution (equation 7) is shown in Figure 3.

Figure 3. Three-dimensional representation of the distribution equation d(a, F).

To calculate the mean value of the fragility distribution, one integrates over all dF:

EW = \bar{F}(a) = \int_{0}^{1} F \, d(a, F) \, dF = \int_{0}^{1} F \, \frac{dQ(a, F)}{dF} \, dF    (8)

where EW = mean value of the fragility distribution.
One can however avoid computing the derivative of Q by integrating equation 8 by parts:

EW = \bar{F}(a) = \int_{0}^{1} F \, \frac{dQ(a, F)}{dF} \, dF = \left[ F \, Q(a, F) \right]_{0}^{1} - \int_{0}^{1} Q(a, F) \, dF = 1 - \int_{0}^{1} Q(a, F) \, dF    (9)

2.2.2 Uncertainty distribution
In order to model the associated fragility uncertainty distribution, the variance σ² is calculated using the following equation:

\sigma^2 = \int_{0}^{1} \left( F - \bar{F}(a) \right)^2 d(a, F) \, dF    (10)

It appears that the different fragility distributions (for different accelerations a) can be modeled with a beta distribution with a high level of accuracy. The beta distribution is shown below:

\mathrm{beta}(\alpha, \beta, x) := \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha) \, \Gamma(\beta)} \, x^{\alpha - 1} (1 - x)^{\beta - 1}    (11)

where:

\Gamma(x) := \int_{0}^{\infty} t^{x - 1} e^{-t} \, dt    (12)

By forcing the beta distribution to yield the mean value and the variance of the distribution, one obtains the following expressions for EW and σ² (equations 13 and 14):

EW := \int_{0}^{1} x \, \mathrm{beta}(\alpha, \beta, x) \, dx = \frac{\alpha}{\alpha + \beta}    (13)

\sigma^2 = \int_{0}^{1} (x - EW)^2 \, \mathrm{beta}(\alpha, \beta, x) \, dx = \frac{\alpha \beta}{(\alpha + \beta)^2 (\alpha + \beta + 1)}    (14)

where EW = mean value of d(a, F); σ² = variance of d(a, F).
Solving equations 13 and 14 for α and β yields:

\alpha = \frac{EW \left( EW - EW^2 - \sigma^2 \right)}{\sigma^2}    (15)

\beta = \frac{EW^3 - 2 EW^2 + EW + EW \sigma^2 - \sigma^2}{\sigma^2}    (16)
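The inversion in equations 15 and 16 can be checked algebraically: given a target mean EW and variance σ², the fitted shape parameters must reproduce the beta moments of equations 13 and 14. A minimal sketch (the EW and σ² values below are illustrative, not taken from the paper):

```python
from scipy.stats import beta as beta_dist

def beta_shape_params(ew, var):
    """Shape parameters alpha, beta of the beta distribution matching a
    given mean and variance (equations 15 and 16)."""
    alpha = ew * (ew - ew**2 - var) / var
    beta = (ew**3 - 2 * ew**2 + ew + ew * var - var) / var
    return alpha, beta

# Illustrative moments of a fragility distribution d(a, F); values are
# made up for demonstration only. Note the validity condition
# var < ew * (1 - ew), which any proper distribution on [0, 1] satisfies.
ew, var = 0.35, 0.02
a, b = beta_shape_params(ew, var)

# The fitted distribution reproduces the target moments (eqs. 13, 14).
assert abs(beta_dist.mean(a, b) - ew) < 1e-9
assert abs(beta_dist.var(a, b) - var) < 1e-9
```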

2.2.3 Numerical integration

The mean value EW and the variance σ² are numerically integrated using equations 9 and 10, respectively.
Due to the stiff characteristics of Erf(x) and InvErf(x) at their respective boundaries, effective numerical computation of those functions has to be considered.
The numerical computation of Erf(x) is based on Mori's algorithm (Mori 1983), which guarantees high precision over the interval [-4; 4].
The numerical computation of InvErf(x) is based on Strecok's algorithm (Strecok 1968) for the inverse normal cumulative distribution function and on Halley's rational method (Scavo, Thoo 1995) using third-order refinement. The algorithm guarantees machine precision over the interval [-1; 1].
Numerical integration of EW and σ² is then performed using the Runge-Kutta method; the shape parameters α and β of the corresponding beta distribution are then calculated using equations 15 and 16, respectively.
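A minimal numerical sketch of equation 9, with scipy's erf/erfinv and a plain trapezoidal rule standing in for the Mori/Strecok/Runge-Kutta machinery described above (this is a sketch, not the BDDFT implementation). As a cross-check, the mean of this doubly lognormal fragility family also has the well-known closed form Φ(ln(a/a_m)/√(β_r² + β_u²)):

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.special import erf, erfinv
from scipy.stats import norm

def Q_of_F(F, a, am, beta_r, beta_u):
    """Confidence level Q as a function of the fragility F (equation 6)."""
    num = np.sqrt(2.0) * beta_r * erfinv(2.0 * F - 1.0) - np.log(a / am)
    return 0.5 * (1.0 + erf(num / (np.sqrt(2.0) * beta_u)))

def mean_fragility(a, am, beta_r, beta_u, n=200_001):
    """Mean fragility EW = 1 - integral_0^1 Q(a, F) dF (equation 9); F is
    kept just inside (0, 1) to avoid the InvErf singularities there."""
    F = np.linspace(1e-12, 1.0 - 1e-12, n)
    return 1.0 - trapezoid(Q_of_F(F, a, am, beta_r, beta_u), F)

# Cross-check against the closed-form composite fragility; with the DG
# capacity (am=1.12, beta_r=0.31, beta_u=0.28) at a = 1.00 g, both agree
# with the 3.93E-01 entry of Table 2.
ew = mean_fragility(1.00, 1.12, 0.31, 0.28)
closed_form = norm.cdf(np.log(1.00 / 1.12) / np.hypot(0.31, 0.28))
assert abs(ew - closed_form) < 1e-4
assert abs(ew - 0.393) < 1e-3
```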
3 ANALYTICAL FAULT TREE SOLUTION
USING BINARY DECISION DIAGRAMS

3.1 Limitation of the rare event approximation

The fault tree / event tree approach is widely used in the industry as the underlying formalism of probabilistic risk assessment, including seismic risk assessments. Most of the tools available to quantify Boolean tree models implement established techniques based on minimal cut sets and the rare event approximation.
With such tools, problems based on general probabilities, i.e. success and failure events, cannot be solved exactly in many cases.
In addition, the inclusion-exclusion formula (equation 17) is often only evaluated up to a given depth:

P(A_1 \cup \ldots \cup A_p) = \sum_{1 \le i \le p} P(A_i) - \sum_{1 \le i_1 < i_2 \le p} P(A_{i_1} \cap A_{i_2}) + \ldots + (-1)^{p-1} P(A_1 \cap \ldots \cap A_p)    (17)

In almost all fault tree / event tree codes, union gates are evaluated using the first order approximation. In most probabilistic risk analyses the use of these approximations is fully justified, since the number of high probability events is limited.
Those problems become critical when events with high failure probability are considered, as found in seismic risk analysis.

3.2 Proposed quantification method

In order to overcome those deficiencies, an alternative approach to assess models with high event probabilities is discussed.
Binary decision diagrams (BDD) are one alternative modeling approach (Bryant 1986). The benefit of applying this approach rather than established kinetic tree methods is that fault trees and event trees can be quantified analytically. The BDD method is particularly effective for fault trees with relatively few dependent events.

3.2.1 Binary decision diagrams (BDD)
A binary decision diagram of a Boolean expression is the compacted representation of its Shannon expansion.
A binary decision diagram is a rooted, directed acyclic graph with one or two terminal nodes of out-degree zero labeled 0 (expression is false) or 1 (expression is true), and with a set of variable nodes u (i.e. event conditions) of out-degree two. The two outgoing edges of a node u are given by two logical functions low(u) (variable node is false) and high(u) (variable node is true).
A binary decision diagram is ordered (OBDD) if on all paths through the graph the variables (var) respect a given linear order x_0 < x_1 < ... < x_n.
An OBDD is reduced (ROBDD) if:
- no two distinct nodes u and v have the same variable name and the same low- and high-successors (uniqueness):
  var(u) = var(v), low(u) = low(v), high(u) = high(v) implies u = v, and if
- no variable node u has identical low- and high-successors:
  high(u) ≠ low(u)
An example of a binary decision diagram is shown in Figure 4.

Figure 4. BDD representation of the Boolean expression (X_0 ∩ X_1) ∪ (X_0 ∩ X_2) ∪ (X_1 ∩ X_2). The black lines correspond to the high branch (event condition true), while the dotted lines correspond to the low branch (event condition false).

3.2.2 Fault tree to BDD conversion

Converting a fault tree structure to a BDD is a dynamic and iterative operation that is performed by traversing the trees in a depth-first manner and by building the resulting BDD structure while going up (bottom-up approach). Individual BDDs are combined using dynamic and recursive programming.
Dependencies, i.e. common cause failure groups, are modeled using dedicated common cause failure fault trees.
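The mechanics above can be illustrated with a deliberately minimal ROBDD sketch (hypothetical toy code, not the BDDFT implementation): nodes are hash-consed through a unique table, gates are combined bottom-up with a recursive apply operation, and the top event probability follows analytically from the Shannon expansion. The 2-out-of-3 expression of Figure 4 serves as the example.

```python
# Terminals are the ints 0 and 1; internal nodes are (var, low, high)
# triples, hash-consed through a unique table so that structurally
# equal nodes are shared (the ROBDD uniqueness rule).
_unique = {}

def node(var, low, high):
    """Apply the ROBDD reduction rules when creating a node."""
    if low == high:  # redundant test: high(u) == low(u)
        return low
    return _unique.setdefault((var, low, high), (var, low, high))

def apply_op(op, u, v):
    """Combine two BDDs with a Boolean operator via Shannon expansion."""
    if isinstance(u, int) and isinstance(v, int):
        return op(u, v)
    uvar = u[0] if not isinstance(u, int) else float("inf")
    vvar = v[0] if not isinstance(v, int) else float("inf")
    var = min(uvar, vvar)  # expand on the smallest variable in the order
    u0, u1 = (u[1], u[2]) if uvar == var else (u, u)
    v0, v1 = (v[1], v[2]) if vvar == var else (v, v)
    return node(var, apply_op(op, u0, v0), apply_op(op, u1, v1))

def prob(u, p):
    """Exact top event probability, evaluated analytically on the BDD."""
    if isinstance(u, int):
        return float(u)
    var, low, high = u
    return (1.0 - p[var]) * prob(low, p) + p[var] * prob(high, p)

AND = lambda a, b: a & b
OR = lambda a, b: a | b

# The 2-out-of-3 expression of Figure 4
x0, x1, x2 = (node(i, 0, 1) for i in range(3))
f = apply_op(OR, apply_op(OR, apply_op(AND, x0, x1),
                              apply_op(AND, x0, x2)),
                 apply_op(AND, x1, x2))

p = {i: 0.3 for i in range(3)}
exact = prob(f, p)            # 3p^2 - 2p^3 = 0.216
first_order = 3 * 0.3 ** 2    # rare event sum over the minimal cut sets
assert abs(exact - 0.216) < 1e-12
assert first_order > exact    # the approximation is conservative
```

A production implementation would additionally memoize apply_op and use a fixed variable-ordering heuristic; both are omitted here for brevity.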
3.2.3 Limitation of the binary decision diagram methodology
The size of the BDD representation of a Boolean expression heavily depends on the variable ordering that is chosen. Finding an optimal order for a BDD is an NP-complete problem.
Several heuristic methods based on expert knowledge have been developed for variable ordering. Good heuristics have been proposed to deal with multi-level logic networks, which can be used for probabilistic safety assessments because fault trees are such networks (Coudert, Madre 1993).

3.2.4 BDDFT computer code

A computer code has been developed to assess probabilistic models with high event probabilities using binary decision diagrams. This code integrates the seismic fragility equations of section 2. By defining the seismic hazard curve, fragilities are automatically computed with their uncertainty parameters for discrete ground motion levels. The system fault trees are automatically converted to their corresponding BDD structure. As a result, the top event probability is numerically computed using the analytical prime implicant solutions.
Different fault trees can be linked within the model using transfer gates. The computer code graphical user interface (GUI) features two visualization modes, one for the fault tree and one for its BDD representation.
The code also includes dedicated quantification modules for assessing other similar high event probability problems, for instance human reliability analysis (HRA) or common cause failure (CCF).

3.3 Seismic Benchmark

3.3.1 Benchmark model
A simplified, yet representative model is proposed in order to assess the impact of the rare event approximation, as well as to validate the use of binary decision diagrams on probabilistic models with high event probabilities.
The benchmark model is derived from standard emergency core cooling systems as found in nuclear power plants. It consists of an offsite power supply (OS), two redundant pumps (PUMP1 and PUMP2), two check valves (CHECK_V1 and CHECK_V2), one common injection valve (INJ_V), a motor control center (MCC), and one emergency diesel generator (DG), as shown in Figure 5.

Figure 5. Benchmark model (P&ID).

A corresponding fault tree representation is shown in Figure 6. The top event "fails to discharge water" is addressed.

Figure 6. Fault tree model for the benchmark model.

The individual basic event probabilities and seismic capacities are shown in Table 1.
Table 1. Basic event probability and seismic capacity.
___________________________________________________
Basic event   Probability      Seismic capacity
              (non-seismic)    a_m    β_r    β_u
___________________________________________________
DG            1E-04            1.12   0.31   0.28
MCC           1E-07            0.89   0.30   0.27
INJ_V         1E-04            very high
CHECK_V1      8E-04            very high
CHECK_V2      8E-04            very high
PUMP1         1E-03            3.26   0.24   0.30
PUMP2         1E-03            3.26   0.24   0.30
OS            3E-02            0.30   0.25   0.50
___________________________________________________

3.3.2 Seismic Fragilities

The seismic fragilities for the individual basic events of Table 1 are calculated using the method described in section 2. The following table shows the fragility mean values for horizontal seismic peak ground accelerations (PGA) between 0.05 and 2.00g.
Table 2. Basic event fragilities for PGA between 0.05 and 2.00g (fragility cutoff: 1.00E-12).
________________________________________________
PGA [g]   Mean fragilities
          DG        MCC       PUMP1/2   OS
________________________________________________
0.05      0         0         0         6.72E-04
0.10      7.57E-10  1.01E-08  0         2.47E-02
0.15      4.38E-07  3.79E-06  0         1.07E-01
0.20      1.54E-05  9.91E-05  0         2.34E-01
0.25      1.54E-04  8.05E-04  0         3.72E-01
0.30      7.84E-04  3.49E-03  1.18E-10  5.00E-01
0.35      2.65E-03  1.03E-02  1.79E-09  6.09E-01
0.40      6.82E-03  2.37E-02  1.59E-08  6.97E-01
0.45      1.45E-02  4.55E-02  9.57E-08  7.66E-01
0.50      2.67E-02  7.65E-02  4.31E-07  8.20E-01
0.60      6.75E-02  1.64E-01  4.72E-06  8.92E-01
0.70      1.30E-01  2.76E-01  2.92E-05  9.35E-01
0.80      2.10E-01  3.96E-01  1.23E-04  9.60E-01
0.90      3.00E-01  5.11E-01  3.96E-04  9.75E-01
1.00      3.93E-01  6.14E-01  1.04E-03  9.84E-01
1.20      5.66E-01  7.70E-01  4.62E-03  9.93E-01
1.40      7.03E-01  8.69E-01  1.39E-02  9.97E-01
1.60      8.03E-01  9.27E-01  3.19E-02  9.99E-01
1.80      8.72E-01  9.59E-01  6.10E-02  9.99E-01
2.00      9.17E-01  9.78E-01  1.02E-01  1
________________________________________________

3.3.3 Analytic prime implicants

The prime implicants of the proposed model are derived from the BDD structure (Figure 7). The BDD structure is calculated by the BDDFT computer code (see section 3.2.4).

Figure 7. Binary decision diagram (BDD) structure for the benchmark model.

4 COMPARISON BETWEEN ANALYTICAL AND FIRST ORDER APPROXIMATION SOLUTIONS

4.1 Non-seismic assessment

The benchmark model was quantified using both the analytical prime implicants and the first order approximation (non-seismic). The results are shown in Table 3:
Table 3. Benchmark model: top event probability comparison between the analytical and the rare-event approaches (non-seismic).
________________________________________
Quantification    Top event probability
                  (non-seismic)
________________________________________
Analytic          3.02E-02
First order       3.03E-02
Relative error    0.28%
________________________________________

4.2 Seismic assessment


The benchmark model was quantified using both the analytical prime implicants and the first order approximation, for discrete ground motion levels. The results are shown as a function of the ground motion level (peak ground acceleration) in Figure 8.
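The trend in Figure 8 can be reproduced qualitatively with a one-line model: for independent events the exact union probability is 1 - prod(1 - p_i), while the rare event approximation keeps only the first sum of equation 17. A small illustration (the probabilities are made up and do not come from the benchmark):

```python
def exact_union(ps):
    """Exact probability of the union of independent events."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def first_order(ps):
    """Rare event approximation: first term of inclusion-exclusion."""
    return sum(ps)

# Relative error of the approximation for three identical events
for p in (1e-3, 1e-2, 1e-1, 0.5):
    ps = [p] * 3
    err = (first_order(ps) - exact_union(ps)) / exact_union(ps)
    print(f"p={p:<6}  exact={exact_union(ps):.3e}  "
          f"first order={first_order(ps):.3e}  rel. error={err:.1%}")
```

At p = 0.5 the first order sum already overestimates the exact union by more than 70 percent, mirroring the growth of the relative error with acceleration in Figure 8.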

Figure 8. Benchmark model: top event probability comparison between the analytical and the rare-event approaches for seismic peak ground accelerations between 0.05 and 2.00g. (Plotted series: analytic and first order top event probability [-] versus acceleration [g], with the relative error [%] on a secondary axis.)

5 CONCLUSIONS

An analytical derivation to calculate the probability of seismically induced component failure was developed, taking into account the sustained seismic ground motion level, the median seismic capacity of the component, and lognormal unit distributions for the randomness of the seismic impact and for the epistemic uncertainty in the determination of the component fragility. Such a formulation is important in order to separate the effects of randomness and uncertainty in the seismic risk. Algorithm candidates for a robust integration of the proposed formula have been discussed. Fragility uncertainty can be modeled using a beta uncertainty distribution, with both shape parameters derived from the analytic formulation.
A computer code, BDDFT, has been developed. BDDFT is a dual mode fault tree binary decision diagram (BDD) quantifier. The analyst can develop models in a standard fault tree window, while the computer code automatically converts the corresponding Boolean model to a binary decision diagram structure. Such BDD-based computer codes produce analytically correct prime implicant solutions of Boolean models. They are required when probabilistic models with high failure probabilities are studied.
The impact of the rare event approximation as used in several industrial applications is addressed. The results showed that the impact of the approximation is minimal for standard probabilistic models with relatively low event probabilities (e.g. high component reliability). However, special attention should be paid when analyzing models with high event probabilities, as found in PSRA, human reliability analysis (HRA), severe accident management (SAM), and common cause failure groups (CCF) with high dependency. In such models, the impact of the rare event approximation becomes non-negligible and produces over-conservative results.

REFERENCES

American Nuclear Society and Institute of Electrical and Electronics Engineers 1983. PRA Procedures Guide: A Guide to the Performance of Probabilistic Risk Assessments for Nuclear Power Plants, NUREG/CR-2300. Washington: U.S. Nuclear Regulatory Commission.
Bryant, R.E., 1986. Graph-Based Algorithms for Boolean Function Manipulation, IEEE Transactions on Computers, Volume 35, Number 8: 677-691.
Coudert, O. & Madre, J.C., 1993. Fault Tree Analysis: 10^20 Prime Implicants and Beyond, Proc. of the Annual Reliability and Maintainability Symp.: 240-245.
Mori, M., 1983. A method for evaluation of the error function of real and complex variable with high relative accuracy, Publ. Res. Inst. Math. Sci. 19: 1081-1094.
Scavo, T.R. & Thoo, J.B., 1995. On the Geometry of Halley's Method, Amer. Math. Monthly 102: 417-426.
Strecok, A.J., 1968. On the calculation of the inverse of the error function, Math. Comp. 22: 144-158.
