
STOCHASTIC FINITE ELEMENTS

Roger Ghanem

University of Southern California


Los Angeles, California
USA

SHORT COURSE: UNICAMP, CAMPINAS - BRAZIL - MARCH 2nd, 2007

Outline

- Motivation
- Mathematical structure for validation and verification
- Packaging of information using this structure: representing experimental information
- Construction of approximations on these structures
- Numerical implementation of stochastic predictions (SFEM)
- Example applications
- Technical challenges and conclusions

Motivation

Excerpts from published emails in connection with shuttle Columbia's last mission:

"... simulations showed that landing with 2 flat tires was survivable. Bob and David expressed some skepticism as to the accuracy of the Ames sim in light of other data (Convair 990 testing)."

Problem Definition

Consider a man and a woman: he makes a proposition.

She says: "I have flipped a coin: if you know the outcome, then YES." (Not interested: the man has no chance.)

She says: "I will flip a coin: if you will know the outcome, then YES." (The man has a chance.)

Solution
Quantity of interest: a decision with an upper bound on risk.

Flip the coin 100 times -> calibrated decision tool -> 52-48
Observe initial configuration -> calibrated decision tool -> 70-30
Observe surface tension -> calibrated decision tool -> 80-20
Observe quantum states -> calibrated decision tool -> 99-1

These probabilities are not intrinsic properties of the state. They are conditioned on available information.

Use model-based predictions in lieu of physical experiments to save time and cost.

Ingredients:


- Calibrate a stochastic plant (you must characterize/parameterize it first!)
- Evaluate the limit on the predictability of the plant
- Refine the plant if the target confidence is not achievable

Interaction of model and data

Error budget

We use probabilistic models for uncertainty


The probabilistic framework provides a packaging of information that is amenable to rigorous analysis, permitting the quantification of uncertainty.

Although: "[I am] unacquainted with problems where wrong results could be attributed to failure to use measure theory." (E.T. Jaynes, 2003)

An experiment is defined by a probability triple $(\Omega, \mathcal{F}, P)$, where $(\Omega, \mathcal{F})$ is a measurable space:

- $\Omega$: set of elementary events
- $\mathcal{F}$: sigma-algebra of all events that make sense
- $P$: measure on all elements of $\mathcal{F}$

A random variable $X$ is a measurable mapping on a probability space, $X : (\Omega, \mathcal{F}, P) \to (\mathbb{R}, \mathcal{B})$.

The probability distribution $P_X$ of $X$ is the image of $P$ under $X$, defining the probability triple $(\mathbb{R}, \mathcal{B}, P_X)$.

What do we mean by things being close?

Modes of Convergence: Let $\{X_n\}$ be a sequence of random variables defined on $(\Omega, \mathcal{F}, P)$ and let $X$ be another random variable on the same probability space:

- Almost sure convergence: $P\left(\lim_{n\to\infty} X_n = X\right) = 1$
- Convergence in mean square: $\lim_{n\to\infty} E\left[(X_n - X)^2\right] = 0$
- Convergence in mean: $\lim_{n\to\infty} E\left[|X_n - X|\right] = 0$
- Convergence in probability: $\lim_{n\to\infty} P\left(|X_n - X| > \epsilon\right) = 0$ for every $\epsilon > 0$
- Convergence in distribution: $F_{X_n}(x) \to F_X(x)$ at every continuity point $x$ of $F_X$

If $X_n$ converges to $X$ in mean square, then $X_n$ converges in distribution to $X$.

Convergence in mean square permits an $L_2$ analysis of random variables: a first step toward a unified perspective on verification and validation.

What is a random variable?

A random variable is a function: a measurable function $X : \Omega \to \mathbb{R}$.

Cameron-Martin: Any second-order functional of the Brownian motion can be expanded as a mean-square convergent series in infinite-dimensional Hermite polynomials in Gaussian variables.

i.e.: A process that can be modeled as the output of a reasonably well-behaved system driven by highly oscillatory input can be represented using the Wiener Chaos: a series of multi-dimensional Hermite polynomials in Gaussian variables.

POLYNOMIAL CHAOS EXPANSION

Cameron-Martin Theorem
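To make the chaos expansion concrete, here is a minimal one-dimensional sketch (illustrative code, not from the slides): a lognormal variable $Y = \exp(\xi)$, with $\xi$ standard Gaussian, is projected onto probabilists' Hermite polynomials and the truncated series is checked for mean-square convergence by Monte Carlo. The truncation order and sample size are arbitrary choices.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import factorial

rng = np.random.default_rng(0)
xi = rng.standard_normal(200_000)   # Gaussian germ
y = np.exp(xi)                      # second-order functional of the germ (lognormal)

# Chaos coefficients c_k = E[y He_k(xi)] / E[He_k(xi)^2], with E[He_k^2] = k!
order = 6
coeffs = []
for k in range(order + 1):
    e_k = np.zeros(k + 1); e_k[k] = 1.0      # coefficient vector selecting He_k
    coeffs.append(np.mean(y * hermeval(xi, e_k)) / factorial(k))

# Truncated Wiener-Hermite (polynomial chaos) expansion, evaluated on the same samples
y_pc = hermeval(xi, coeffs)
print("relative mean-square error:", np.mean((y - y_pc) ** 2) / np.var(y))
```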

Another special basis: Karhunen-Loeve expansion

Reference: J.B. Read (1983)

[Figures: one realization of foam properties, and the corresponding KL modes]

Management of uncertainty

COORDINATES IN THIS SPACE REPRESENT PROBABILISTIC CONTENT.

SENSITIVITY OF PROBABILISTIC STATEMENTS OF BEHAVIOR ON DATA.

Adapted bases

COORDINATES IN THIS SPACE REPRESENT PROBABILISTIC CONTENT.

Representing uncertainty

The random quantities are resolved as

$$\alpha(\omega) \approx \sum_{i=0}^{P} \alpha_i\, \Psi_i(\xi(\omega)),$$

where the $\Psi_i$ are multidimensional orthogonal polynomials in independent random variables.

The random quantities could be, for example:

- Parameters in a PDE
- Boundaries in a PDE (e.g. geometry)
- Field variables in a PDE

The dimension of the approximation reflects the stochastic complexity and heterogeneity of the process.

First Step: Model Reduction with Karhunen-Loeve

Starting with observations of the process over a limited subset of the indexing set, form the covariance matrix of the observations.

Reduced-order representation (KL):

$$\alpha(x, \omega) \approx \bar{\alpha}(x) + \sum_{k=1}^{d} \sqrt{\lambda_k}\, \eta_k(\omega)\, \phi_k(x),$$

where $(\lambda_k, \phi_k)$ are the dominant eigenpairs of the covariance and the $\eta_k$ are uncorrelated, zero-mean, unit-variance random variables.
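A minimal numerical sketch of this reduction step (illustrative code and interface, not from the slides): the covariance matrix of the observed realizations is formed, and its dominant eigenpairs define the KL modes and the reduced variables.

```python
import numpy as np

def kl_expansion(samples, d):
    """Discrete Karhunen-Loeve reduction of observed process realizations.

    samples : (n_obs, n_points) array, each row one realization on the observation grid
    d       : number of KL modes to retain
    Returns the mean, retained eigenvalues, modes, and reduced (uncorrelated) variables.
    """
    mean = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)          # covariance matrix of the observations
    lam, phi = np.linalg.eigh(cov)               # eigenvalues in ascending order
    idx = np.argsort(lam)[::-1][:d]              # keep the d dominant modes
    lam, phi = lam[idx], phi[:, idx]
    eta = (samples - mean) @ phi / np.sqrt(lam)  # zero-mean, unit-variance, uncorrelated
    return mean, lam, phi, eta

# usage: reconstruct the first realization from its KL coordinates
# mean, lam, phi, eta = kl_expansion(samples, d=5)
# x_hat = mean + (np.sqrt(lam) * eta[0]) @ phi.T
```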

Representing uncertainty

- Galerkin Projection
- Maximum Likelihood Formulation
- Bayesian Inference
- Maximum Entropy

Representation of uncertainty: Galerkin projection

The random quantities are resolved as

$$\alpha(x, \omega) \approx \sum_{i=0}^{P} \alpha_i(x)\, \Psi_i(\xi(\omega)).$$

These could be, for example:

- Parameters in a PDE
- Boundaries in a PDE (e.g. geometry)
- Field variables in a PDE

These decompositions provide a resolution (or parameterization) of the uncertainty on spatial or temporal scales.

Reference: Sakamoto and Ghanem, Probabilistic Engineering Mechanics and Journal of Engineering Mechanics, 2000.

Simulation of non-Gaussian processes

The marginal probability density function of the stochastic process is specified at every point in the indexing set, along with the two-point correlation function. The process is characterized and simulated through its Polynomial Chaos coordinates.

[Figures: PDF of the process represented with 1st, 2nd, 3rd, and 4th order chaos, together with realizations of the process for each order]
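One way to realize such a simulation in code, in the spirit of the Sakamoto-Ghanem reference (a hedged sketch with an illustrative lognormal marginal and an assumed exponential germ correlation, not the slides' example): the 1D chaos coefficients of the pointwise translation map are computed by Gauss-Hermite quadrature, and a correlated Gaussian field is then pushed through the truncated expansion. Matching the target two-point correlation exactly would additionally require iterating on the germ correlation.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval, hermegauss
from scipy.stats import norm, lognorm
from math import factorial

# 1D chaos coefficients of the pointwise map g(xi) = F_target^{-1}(Phi(xi)),
# via Gauss-Hermite quadrature for the probabilists' weight exp(-x^2/2)
nodes, weights = hermegauss(20)
weights = weights / np.sqrt(2.0 * np.pi)        # normalize to a probability measure
g = lognorm(s=0.5).ppf(norm.cdf(nodes))         # illustrative target marginal

order = 4
coeffs = []
for k in range(order + 1):
    e_k = np.zeros(k + 1); e_k[k] = 1.0
    coeffs.append(np.sum(weights * g * hermeval(nodes, e_k)) / factorial(k))

# Correlated Gaussian germ on a 1D grid, mapped pointwise through the chaos expansion
x = np.linspace(0.0, 1.0, 200)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)       # assumed germ correlation
L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))
xi_field = L @ np.random.default_rng(1).standard_normal(len(x))
realization = hermeval(xi_field, coeffs)                  # one non-Gaussian realization
```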

Representation of uncertainty: Maximum likelihood

Reference: Desceliers, C., Ghanem, R. and Soize, C., "Maximum likelihood estimation of stochastic chaos representation from experimental data," International Journal for Numerical Methods in Engineering (to appear).

Essential dimensionality of a process

[Figures: stochastic parameters over the physical object (linear elasticity); convergence of the PDF; convergence as a function of dimensionality, panels (a)-(d)]

Representation of uncertainty: Bayesian inference


The objective is to estimate the chaos coefficients of the reduced variables.

- Polynomial Chaos representation of the reduced variables
- Constraint on the chaos coefficients
- Estimation of the stochastic process using the estimate of the reduced variables

Representation of uncertainty: Bayesian inference


Define a cost function (hats denote estimators); the Bayes estimate is the minimizer of the posterior expected cost.

Bayes' rule gives the posterior of the chaos coefficients:
- Use kernel density estimation to represent the likelihood function
- Use Markov Chain Monte Carlo to sample from the posterior (Metropolis-Hastings algorithm) --> BIMH

Reference: Ghanem and Doostan, Journal of Computational Physics, 2006.
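A bare-bones sketch of the Metropolis-Hastings step used in such a BIMH scheme (illustrative, not the paper's implementation): a random walk over the chaos coefficients, where `log_post` is assumed to combine the kernel-density likelihood of the data with the prior.

```python
import numpy as np

def metropolis_hastings(log_post, c0, n_steps=20_000, step=0.05, seed=0):
    """Random-walk Metropolis-Hastings over a vector of chaos coefficients.

    log_post : callable returning the unnormalized log posterior of the coefficients
               (e.g. kernel-density log-likelihood of the data plus log prior)
    c0       : initial coefficient vector
    """
    rng = np.random.default_rng(seed)
    c = np.asarray(c0, dtype=float)
    lp = log_post(c)
    chain = np.empty((n_steps, c.size))
    for i in range(n_steps):
        prop = c + step * rng.standard_normal(c.size)   # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:        # accept with prob min(1, ratio)
            c, lp = prop, lp_prop
        chain[i] = c
    return chain
```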

Characterization of Uncertainty:
Bayesian Inference

Posterior distributions of the coefficients in the polynomial chaos expansion of the reduced variables.

Reference: Doostan and Ghanem, Journal of Computational Physics, 2006.

Distribution of the recovered process:

Representation of uncertainty: MaxEnt and Fisher information


Maximum Entropy Density Estimation (MEDE) results in joint measure of KL variables:
Observe at N locations and n sets of observations.
Reduce dimensionality using KL.
Moments of observations as constraints:

Fisher Information Matrix:

Then asymptotically (h denotes coefficients in polynomial chaos description of observations):

Representation of uncertainty: MaxEnt and Fisher information

Reference: Das, Ghanem, and Spall, SIAM Journal on Scientific Computing, 2006.
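For intuition, here is a small sketch of maximum-entropy density estimation on a 1D grid with raw-moment constraints (an illustrative stand-in, not the MEDE algorithm of the reference): the Lagrange multipliers are obtained by minimizing the convex dual.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_density(moments, grid):
    """Maximum-entropy density on `grid` matching the raw moments E[x^k], k = 1..m.

    The grid is assumed wide enough to carry essentially all of the probability mass.
    """
    mu = np.asarray(moments, dtype=float)
    powers = np.arange(1, mu.size + 1)
    phi = grid[:, None] ** powers           # constraint functions x^k on the grid
    dx = grid[1] - grid[0]

    def dual(lam):                          # convex dual: log Z(lam) + lam . mu
        logq = -phi @ lam
        m = logq.max()                      # stabilize the exponentials
        return m + np.log(np.sum(np.exp(logq - m)) * dx) + lam @ mu

    lam = minimize(dual, np.zeros(mu.size), method="BFGS").x
    q = np.exp(-phi @ lam)
    return q / (np.sum(q) * dx)

# usage: mean 0 and second moment 1 recover (approximately) the standard Gaussian
# p = maxent_density([0.0, 1.0], np.linspace(-5.0, 5.0, 2001))
```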

Uncertainty modeling for system parameters


Approximate asymptotic representation:

Representation on the set of observations:

Remark: Both intrinsic uncertainty and uncertainty due to lack of data are represented.

Representation smoothed over the whole domain:

Remark: the smoothed representation is formulated by spectral decomposition.

Stochastic Prediction

Propagation of uncertainty from parameters to prediction: approximation in the measure space defined by the parameters and the physics.

Stochastic Prediction

A random variable is a function: $Y$ is a measurable function of the basic random variables $\xi$.

Characterize $Y$, given the characterization of $\xi$.

Functional representations

Assumptions:
- Second-order random variables: $E[Y^2] < \infty$
- The basic random vectors are independent of one another
- A given vector does not necessarily involve independent random variables

Functional representations

Given bases of the spaces $H_{j,k}$, other bases can be constructed, as sketched below.
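For instance, a multivariate chaos basis can be assembled from univariate families by tensorization with a total-degree truncation. The sketch below (illustrative, with Hermite factors) builds the multi-index set and evaluates the product polynomials.

```python
import numpy as np
from itertools import product
from numpy.polynomial.hermite_e import hermeval

def total_degree_multi_indices(d, p):
    """All multi-indices alpha in N^d with |alpha| <= p (total-degree truncation)."""
    return [a for a in product(range(p + 1), repeat=d) if sum(a) <= p]

def multivariate_hermite(xi, alpha):
    """Psi_alpha(xi) = prod_k He_{alpha_k}(xi_k), evaluated at samples xi of shape (n, d)."""
    out = np.ones(xi.shape[0])
    for k, deg in enumerate(alpha):
        e = np.zeros(deg + 1); e[deg] = 1.0
        out *= hermeval(xi[:, k], e)
    return out

# e.g. d = 3, p = 2 gives (3+2)!/(3!2!) = 10 basis functions
indices = total_degree_multi_indices(3, 2)
xi = np.random.default_rng(0).standard_normal((5, 3))
Psi = np.column_stack([multivariate_hermite(xi, a) for a in indices])   # (5, 10) design matrix
```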

Functional representations

Reference: Soize, C. and Ghanem, R., "Physical systems with random uncertainties: chaos representations with arbitrary probability measure," SIAM Journal on Scientific Computing, Vol. 26, No. 2, pp. 395-410, 2004.

Some common bases


Infinite-dimensional case:
This is an exercise in stochastic analysis:
Hermite polynomials: Gaussian measure (Wiener: Homogeneous Chaos)
Charlier polynomials: Poisson measure (Wiener: Discrete Chaos)
Very few extensions possible: Friedrichs and Shapiro (Integration of functionals) provide
characterization of compatible measures. Segall and Kailath provide an extension to martingales.

Finite-dimensional case: independent variables:


This is an exercise in one-dimensional approximation:
Askey polynomials:measures from Askey chart (Karniadakis and co-workers)
Legendre polynomials: uniform measure (theoretical results by Babuska and co-workers)
Wavelets: Le-Maitre and co-workers
Arbitrary measures with bounded support: C. Schwab

Finite-dimensional case: dependent variables:


This is an exercise in multi-dimensional approximation:
Arbitrary measures: Soize and Ghanem.
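As a sanity check for the arbitrary-measure case, the following sample-based Gram-Schmidt construction (an illustration, not the Soize-Ghanem construction itself) builds 1D polynomials orthonormal with respect to an empirical measure.

```python
import numpy as np

def orthonormal_polynomials(samples, p):
    """Gram-Schmidt construction of 1D polynomials orthonormal w.r.t. the empirical
    measure of `samples`, used here as a stand-in for an arbitrary probability measure.

    Returns a (p+1, n) array: row k holds pi_k evaluated at the samples.
    """
    basis = []
    for k in range(p + 1):
        v = samples.astype(float) ** k                 # start from the monomial x^k
        for pi in basis:                               # remove components along lower orders
            v = v - np.mean(v * pi) * pi
        basis.append(v / np.sqrt(np.mean(v ** 2)))     # normalize in the empirical L2 norm
    return np.array(basis)

# uniform samples give (up to sampling error) the Legendre family,
# Gaussian samples give the Hermite family
```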

Stochastic Finite Elements


Variational Formulation:

Find $u$ in the tensor-product space $\mathcal{X} = X \otimes Y$ such that

$$a(u, v) = \ell(v) \qquad \forall v \in \mathcal{X},$$

where $a(\cdot,\cdot)$ is the bilinear form and $\ell(\cdot)$ the load functional, both defined as expectations over the probability space.

Notice: $a(\cdot,\cdot)$ should be coercive and continuous.

Stochastic Finite Elements


Approximation Formulation:

Find $u_{h,p} \in X_h \otimes Y_p$ such that

$$a(u_{h,p}, v) = \ell(v) \qquad \forall v \in X_h \otimes Y_p.$$

Write:

$$u_{h,p}(x, \omega) = \sum_{i=0}^{P} \sum_{j=1}^{N} u_{ij}\, \phi_j(x)\, \Psi_i(\xi(\omega)),$$

where $\Psi_i$ is the $i$th Homogeneous Chaos in the Gaussian variables $\xi$ (basis in $Y$) and the $\phi_j$ form the spatial basis in $X$.
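A compact sketch of how the coupled Galerkin system is typically assembled (an illustrative interface; the block sparsity corresponds to the "typical system matrix" plots shown later): the deterministic matrices arising from each KL term of the random coefficient are combined through Kronecker products with the tensor $E[\xi_k \Psi_i \Psi_j]$.

```python
import numpy as np

def assemble_stochastic_galerkin(K_list, c, f0):
    """Assemble the coupled system for u = sum_i u_i(x) Psi_i(xi).

    K_list : deterministic stiffness matrices K_k, one per KL term of the coefficient
             (K_0 corresponds to xi_0 = 1, i.e. the mean), each of shape (n, n)
    c      : array with c[k, i, j] = E[ xi_k Psi_i Psi_j ]
    f0     : deterministic load vector (n,), assumed to excite only Psi_0 = 1
    Returns the (n*P, n*P) system matrix and the (n*P,) right-hand side.
    """
    n, P = K_list[0].shape[0], c.shape[1]
    A = sum(np.kron(c[k], K_list[k]) for k in range(len(K_list)))
    F = np.zeros(n * P)
    F[:n] = f0                     # deterministic load pairs only with Psi_0
    return A, F

# usage sketch: U = np.linalg.solve(A, F); U.reshape(P, n)[i] is the i-th chaos mode u_i(x)
```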

Stochastic Finite Elements


Sources of Error:

- Spatial discretization error
- Random dimension discretization error

- Joint error estimation is possible, for general measures, using nested approximating spaces (e.g. hierarchical FEM) (Doostan and Ghanem, 2004-2007)

- Joint error estimation is possible for special cases:
  - infinite-dimensional Gaussian measure: Benth et al., 1998
  - tensorized uniform measure: Babuska et al., 2004


Typical System Matrix: 1st Order

Typical System Matrix: 2nd Order

Typical System Matrix: 3rd Order

Efficient pre-conditioners

Reference: Ghanem and Kruger, 1997; Ghanem and Pellissetti, 2003.

Non-intrusive implementation

Reference: Ghanem and Ghiocel, 1995; Ghanem and Red-Horse, 1999, 2000, 2001, 2002, 2003; Reagan et al., 2003, 2004, 2005; Soize and Ghanem, 2004.
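A minimal sketch of one non-intrusive strategy (spectral projection by sampling; an illustration consistent with, but not taken from, the cited references): the deterministic solver is called as a black box, and the chaos coefficients are recovered by Monte Carlo projection onto a 1D Hermite basis.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import factorial

def nisp_coefficients(solver, order, n_samples=10_000, seed=0):
    """Non-intrusive spectral projection for a scalar output of a black-box solver.

    solver : callable xi -> quantity of interest; the deterministic code is reused
             unchanged, which is the point of the non-intrusive approach
    order  : highest 1D Hermite chaos order retained
    """
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_samples)
    u = np.array([solver(x) for x in xi])
    coeffs = []
    for k in range(order + 1):
        e_k = np.zeros(k + 1); e_k[k] = 1.0
        coeffs.append(np.mean(u * hermeval(xi, e_k)) / factorial(k))   # E[He_k^2] = k!
    return np.array(coeffs)

# usage: c = nisp_coefficients(lambda x: np.exp(0.3 * x), order=4)
```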
