

Polynomial Chaos-Based Uncertainty Propagation

Intrusive and Non-Intrusive Methods

Bert Debusschere, Habib Najm, Khachik Sargsyan, Cosmin Safta

bjdebus@sandia.gov

Sandia National Laboratories, Livermore, CA

USC UQ Summer School


Acknowledgement

Omar Knio (Duke University, Durham, NC), Roger Ghanem (University of Southern California, Los Angeles, CA), Olivier Le Maître (LIMSI-CNRS, Orsay, France), and many others ...

This work was supported by:

US Department of Energy (DOE), Office of Advanced Scientific Computing Research (ASCR), Scientific Discovery through Advanced Computing (SciDAC) and Applied Mathematics Research (AMR) programs.

Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.


The Goals of this lecture

- Provide a broad introduction to Polynomial Chaos-based forward UQ
- Propagate uncertainties from inputs to model predictions
- Only parametric uncertainties are covered here, no model uncertainty

What you should take away from this lecture:
- An understanding of Polynomial Chaos expansions to represent random variables
- An understanding of intrusive and non-intrusive uncertainty propagation methods
- An appreciation for some of the challenges and associated mitigation approaches in forward UQ

While this overview is broad, it is definitely not exhaustive.


Outline
1 Introduction
2 Spectral Representation of Random Variables
3 Forward Propagation of Uncertainty
4 Sensitivity Analysis
5 Addressing Challenges for PCE-based Uncertainty Quantification
  - Representing Arbitrary RVs
  - Sparse Quadrature Approaches for High-Dimensional Systems
  - Taking Advantage of Sparsity in the System
  - Choice of Observables in Oscillatory / Long Time Horizon Systems
  - Stochastic Domain Decomposition Approaches
  - Data Decomposition Approaches
  - Systems with Inherent Stochasticity
6 Summary
7 References


Polynomial Chaos Expansions (PCEs)

- Representation of random variables with PCEs
- Convergence
- Galerkin projection
- Basis types


One-Dimensional Hermite Polynomials


\psi_0(x) = 1
\psi_k(x) = (-1)^k e^{x^2/2} \frac{d^k}{dx^k} e^{-x^2/2}, \quad k = 1, 2, \ldots

\psi_1(x) = x, \quad \psi_2(x) = x^2 - 1, \quad \psi_3(x) = x^3 - 3x, \ldots

The Hermite polynomials form an orthogonal basis over (-\infty, \infty) with respect to the inner product

\langle \psi_i \psi_j \rangle = \int \psi_i(x) \psi_j(x) w(x) dx = \delta_{ij} \langle \psi_i^2 \rangle

where w(x) is the weight function

w(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}

Note that w(x) is the density of a standard normal random variable.


One Dimensional Polynomial Chaos Expansion


Consider:

u = \sum_{k=0}^{P} u_k \psi_k(\xi)

- u: random variable (RV) represented with a 1D PCE
- u_k: PC coefficients (deterministic)
- \psi_k: 1D Hermite polynomial of order k
- \xi: Gaussian RV

Example: u = 0.5 + 0.2 \psi_1(\xi) + 0.1 \psi_2(\xi)

[Figure: probability density of this example PCE]

A random quantity is represented with an expansion consisting of functions of random variables multiplied by deterministic coefficients:
- The set of deterministic PC coefficients fully describes the RV
- It separates the randomness from the deterministic dimensions


Multidimensional Hermite Polynomials

The multidimensional Hermite polynomial \Psi_i(\xi_1, \ldots, \xi_n) is a tensor product of the 1D Hermite polynomials, with a suitable multi-index i = (i_1, i_2, \ldots, i_n):

\Psi_i(\xi_1, \ldots, \xi_n) = \prod_{k=1}^{n} \psi_{i_k}(\xi_k)

For example, the 2D Hermite polynomials up to order 2:

index k | order p | \Psi_k
0 | 0 | 1
1 | 1 | \xi_1
2 | 1 | \xi_2
3 | 2 | \xi_1^2 - 1
4 | 2 | \xi_1 \xi_2
5 | 2 | \xi_2^2 - 1
... | ... | ...


Multidimensional Inner Products Orthogonality

\langle \Psi_i \Psi_j \rangle = \int \cdots \int \Psi_i(\xi) \Psi_j(\xi)\, g(\xi_1) g(\xi_2) \cdots g(\xi_n)\, d\xi_1 d\xi_2 \cdots d\xi_n = \prod_{k=1}^{n} \langle \psi_{i_k} \psi_{j_k} \rangle = \delta_{ij} \langle \Psi_i^2 \rangle

where g(\xi) = \frac{e^{-\xi^2/2}}{\sqrt{2\pi}}, such that, for u = \sum_{k=0}^{P} u_k \Psi_k,

\langle \Psi_i u \rangle = \sum_{k=0}^{P} u_k \langle \Psi_i \Psi_k \rangle = u_i \langle \Psi_i^2 \rangle \quad \Longrightarrow \quad u_i = \frac{\langle u \Psi_i \rangle}{\langle \Psi_i^2 \rangle}


Multidimensional Polynomial Chaos Expansion


Consider:

u = \sum_{k=0}^{P} u_k \Psi_k(\xi_1, \ldots, \xi_n)

- u: random variable (RV) represented with a multi-D PCE
- u_k: PC coefficients (deterministic)
- \Psi_k: multi-D Hermite polynomials up to order p
- \xi_i: Gaussian RVs
- n: dimensionality of the stochastic space
- P + 1: number of PC terms, P + 1 = \frac{(n+p)!}{n!\,p!}

The number of stochastic dimensions represents the number of independent inputs (degrees of freedom) that affect the random variable u:
- E.g. one stochastic dimension per uncertain model parameter
- Contributions from each uncertain input can be identified
- Compact representation of a random variable and its dependencies

More Formally: Probability Spaces and Random Variables

Let (\Omega, S, P) be a probability space:
- \Omega is an event space
- S is a \sigma-algebra on \Omega
- P is a probability measure on (\Omega, S)

Random variables are functions X: \Omega \to R with a measure \mu corresponding to their image: if X^{-1}(A) \in S, then define \mu(A) = P(X^{-1}(A)).
- p(x) = d\mu/dx: the density of the random variable X (with respect to Lebesgue measure on R)
- Expectation: \langle f \rangle = \int_\Omega f\, dP = \int f\, p(x)\, dx

Let \xi: \Omega \to R^N, such that each \xi_i: \Omega \to R, i = 1, \ldots, N, be a set of random variables.
- S(\xi): \sigma-algebra generated by the set of random variables \xi
- L^2(\Omega, S(\xi), P): Hilbert space of real-valued random variables defined on (\Omega, S(\xi), P) with finite second moments [Ernst et al. 2011]


Convergence Theorem
A random variable u(\omega) in L^2(\Omega, S(\xi), P) can be described by a Polynomial Chaos (PC) expansion in terms of the infinite-dimensional i.i.d. Gaussian basis \xi = \{\xi_i(\omega)\}_{i=1}^{\infty}:

u(\omega) \cong a_0 \Gamma_0 + \sum_{i_1=1}^{\infty} a_{i_1} \Gamma_1(\xi_{i_1}(\omega)) + \sum_{i_1=1}^{\infty} \sum_{i_2=1}^{i_1} a_{i_1 i_2} \Gamma_2(\xi_{i_1}(\omega), \xi_{i_2}(\omega)) + \sum_{i_1=1}^{\infty} \sum_{i_2=1}^{i_1} \sum_{i_3=1}^{i_2} a_{i_1 i_2 i_3} \Gamma_3(\xi_{i_1}(\omega), \xi_{i_2}(\omega), \xi_{i_3}(\omega)) + \ldots

where \Gamma_p is the Polynomial Chaos of order p, \Gamma_0 = 1, and

\Gamma_p(\xi_{i_1}, \ldots, \xi_{i_p}) = (-1)^p e^{\frac{1}{2}\xi^T\xi} \frac{\partial^p}{\partial\xi_{i_1} \cdots \partial\xi_{i_p}} e^{-\frac{1}{2}\xi^T\xi}


Notes on the PC construction

The Polynomial Chaoses are by construction orthogonal with respect to the Gaussian probability measure. They are thus identical to the corresponding multidimensional Hermite polynomials. The first four PCs are given by

\Gamma_0 = 1
\Gamma_1(\xi_i) = \xi_i
\Gamma_2(\xi_{i_1}, \xi_{i_2}) = \xi_{i_1}\xi_{i_2} - \delta_{i_1 i_2}
\Gamma_3(\xi_{i_1}, \xi_{i_2}, \xi_{i_3}) = \xi_{i_1}\xi_{i_2}\xi_{i_3} - \xi_{i_1}\delta_{i_2 i_3} - \xi_{i_2}\delta_{i_1 i_3} - \xi_{i_3}\delta_{i_1 i_2}
...

[R.G. Ghanem and P.D. Spanos, 1991]


A more compact notation ... and nite dimensionality and order

An L^2 random variable u(x, t, \omega) can be described by a PC expansion in terms of:
- Hermite polynomials \Psi_k, k = 0, \ldots, \infty
- the associated infinite-dimensional Gaussian basis \{\xi_i(\omega)\}_{i=1}^{\infty}
- spectral mode strengths u_k(x, t), k = 0, \ldots, \infty

Truncated to finite dimension n and order p, the PC expansion for u is written as

u(x, t, \omega) \cong \sum_{k=0}^{P} u_k(x, t) \Psi_k(\xi(\omega))

where \xi(\omega) = \{\xi_1(\omega), \ldots, \xi_n(\omega)\}, and P + 1 = \frac{(n+p)!}{n!\,p!}


Generalized Polynomial Chaos


PC Type | Polynomial | Domain | Density w(\xi) | Free parameters
Gauss-Hermite | Hermite | (-\infty, +\infty) | \frac{1}{\sqrt{2\pi}} e^{-\xi^2/2} | none
Legendre-Uniform | Legendre | [-1, 1] | \frac{1}{2} | none
Gamma-Laguerre | Laguerre | [0, +\infty) | \frac{x^{\alpha} e^{-x}}{\Gamma(\alpha+1)} | \alpha > -1
Beta-Jacobi | Jacobi | [-1, 1] | \frac{(1-\xi)^{\alpha}(1+\xi)^{\beta}}{2^{\alpha+\beta+1} B(\alpha+1, \beta+1)} | \alpha > -1, \beta > -1

Inner product: \langle \Psi_i \Psi_j \rangle = \int_a^b \Psi_i(\xi) \Psi_j(\xi) w(\xi) d\xi

- The Wiener-Askey scheme provides a hierarchy of possible continuous PC bases, see Xiu and Karniadakis, SISC, 2002.
- Legendre-Uniform PC is a special case of Beta-Jacobi PC
- Beta-Jacobi PC allows tailored accuracy of the PC representation across the domain
- The input parameter domain often dictates the most convenient choice of PC
- Polynomials can also be tailored to be orthogonal w.r.t. a chosen, arbitrary density


Postprocessing / Analysis

- Moments
- Plotting PDFs of RVs represented with PCEs
- When is a PCE accurate enough?


Moments of RVs described with PCEs


u = \sum_{k=0}^{P} u_k \Psi_k(\xi)

Expectation: \langle u \rangle = u_0

Variance:

\sigma^2 = \langle (u - \langle u \rangle)^2 \rangle = \left\langle \left( \sum_{k=1}^{P} u_k \Psi_k(\xi) \right)^2 \right\rangle = \sum_{k=1}^{P} \sum_{j=1}^{P} u_j u_k \langle \Psi_j(\xi) \Psi_k(\xi) \rangle = \sum_{k=1}^{P} u_k^2 \langle \Psi_k(\xi)^2 \rangle
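As an illustration of how these formulas are used in practice, here is a minimal Python sketch (not part of the original slides; it assumes a 1D probabilists'-Hermite PCE, for which the norms are \langle \psi_k^2 \rangle = k!):

import numpy as np
from math import factorial

def pce_mean_std(u):
    # u[k] are the PC coefficients of a 1D Hermite (probabilists') expansion
    u = np.asarray(u, dtype=float)
    norms = np.array([factorial(k) for k in range(u.size)])   # <psi_k^2> = k!
    mean = u[0]
    var = np.sum(u[1:]**2 * norms[1:])                        # sum_k u_k^2 <psi_k^2>, k >= 1
    return mean, np.sqrt(var)

print(pce_mean_std([0.5, 0.2, 0.1]))   # example PCE from the earlier slide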


Plotting PDFs of RVs corresponding to PCEs

u = \sum_{k=0}^{P} u_k \Psi_k(\xi)

- An analytical formula for PDF(u) exists, but it involves polynomial root finding and is hard to generalize to multi-D
- The PCE is cheap to sample:
  - Brute-force sampling and binning the samples u_i into a histogram
  - Kernel Density Estimation (KDE) gives a smoother PDF with fewer samples:

PDF(u) \approx \frac{1}{N_s h} \sum_{i=1}^{N_s} K\!\left(\frac{u - u_i}{h}\right)

where K is the kernel and h is the bandwidth.
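For illustration, a minimal Python sketch of this sampling/KDE route (assumptions not in the slides: a 1D Hermite PCE and SciPy's gaussian_kde with its default bandwidth rule):

import numpy as np
from numpy.polynomial import hermite_e as He
from scipy.stats import gaussian_kde

uk = np.array([0.5, 0.2, 0.1])          # PC coefficients u_k
xi = np.random.randn(50000)             # samples of the Gaussian germ
u_samples = He.hermeval(xi, uk)         # evaluate u = sum_k u_k psi_k(xi)
kde = gaussian_kde(u_samples)           # kernel density estimate of PDF(u)
grid = np.linspace(u_samples.min(), u_samples.max(), 200)
pdf = kde(grid)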


Comparison of histograms and KDE

[Figure: comparison of a histogram and a kernel density estimate constructed from the same samples. Source: Wikipedia, Drleft; licensed under the Creative Commons Attribution-ShareAlike 3.0 License]

Bandwidth h needs to be chosen carefully to avoid oversmoothing


How do I know my PCE is converged?

- The approximation error in PCEs is the topic of much ongoing research
- Rules of thumb:
  - Higher-order PC coefficients should decay
  - Increase the order until results no longer change
- Not always foolproof ...


Propagation of Uncertain Inputs Represented with PCEs

- Galerkin projection approaches project the uncertain quantity onto the space spanned by the PC basis functions
  - Relying on the orthogonality of the basis functions:

u_k = \frac{\langle u \Psi_k \rangle}{\langle \Psi_k^2 \rangle}, \quad k = 0, \ldots, P

  - The residual is orthogonal to the space of basis functions
  - Two different approaches:
    - Intrusive: project the governing equations
    - Non-intrusive: project sampled model outputs
- Collocation approaches match the PCE to the random variable at chosen sample points


Intrusive Spectral Stochastic UQ Formulation: ODE Example

Sample ODE with parameter \lambda:

\frac{du}{dt} = \lambda u

Let \lambda be uncertain; introduce \xi \sim N(0, 1). Express \lambda and u using PCEs in \xi:

\lambda = \sum_{k=0}^{P} \lambda_k \psi_k(\xi), \qquad u(t) = \sum_{k=0}^{P} u_k(t) \psi_k(\xi)

Substitute into the ODE and apply a Galerkin projection onto \psi_i(\xi).


Galerkin Projection onto \psi_i(\xi)

\frac{d}{dt} \sum_{k=0}^{P} u_k(t) \psi_k(\xi) = \left( \sum_{p=0}^{P} \lambda_p \psi_p(\xi) \right) \left( \sum_{q=0}^{P} u_q(t) \psi_q(\xi) \right)

\sum_{k=0}^{P} \frac{du_k(t)}{dt} \psi_k(\xi) = \sum_{p=0}^{P} \sum_{q=0}^{P} \lambda_p u_q(t) \psi_p(\xi) \psi_q(\xi)

Multiply by \psi_i(\xi) and take the expectation:

\sum_{k=0}^{P} \frac{du_k(t)}{dt} \langle \psi_k(\xi) \psi_i(\xi) \rangle = \sum_{p=0}^{P} \sum_{q=0}^{P} \lambda_p u_q(t) \langle \psi_p(\xi) \psi_q(\xi) \psi_i(\xi) \rangle

\frac{du_i}{dt} \langle \psi_i^2 \rangle = \sum_{p=0}^{P} \sum_{q=0}^{P} \lambda_p u_q \langle \psi_p \psi_q \psi_i \rangle


Resulting Spectral ODE system

(P + 1)-dimensional ODE system:

\frac{du_i}{dt} = \sum_{p=0}^{P} \sum_{q=0}^{P} \lambda_p u_q C_{pqi}, \quad i = 0, \ldots, P

where C_{pqi} = \langle \psi_p \psi_q \psi_i \rangle / \langle \psi_i^2 \rangle

- The tensor C_{pqi} can be evaluated once and stored for any given PC order and dimension
- This tensor is sparse, i.e. many elements are zero
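A minimal Python sketch of this construction (an illustration, not UQTk; it assumes a 1D probabilists'-Hermite basis, computes C_pqi by Gauss-Hermite quadrature, and evaluates the right-hand side of the spectral ODE system):

import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

P = 4                                         # PC order
x, w = He.hermegauss(3 * (P + 1))             # Gauss-Hermite (probabilists') rule
w = w / np.sqrt(2.0 * np.pi)                  # weights of the standard normal density
psi = np.array([He.hermeval(x, np.eye(P + 1)[k]) for k in range(P + 1)])   # psi_k at the nodes
norms = np.array([factorial(k) for k in range(P + 1)])                     # <psi_k^2> = k!
C = np.einsum('pn,qn,in,n->pqi', psi, psi, psi, w) / norms                 # C_pqi

def spectral_rhs(u, lam):
    # du_i/dt = sum_{p,q} lambda_p u_q C_pqi
    return np.einsum('p,q,pqi->i', lam, u, C)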


1D 4th-Order C_{ijk} Example: Hermite polynomials

C_{ijk} = \langle \psi_i \psi_j \psi_k \rangle / \langle \psi_k^2 \rangle, with \langle \psi_k^2 \rangle = 1, 1, 2, 6, 24 for k = 0, \ldots, 4.

Nonzero values of \langle \psi_i \psi_j \psi_k \rangle (symmetric under any permutation of i, j, k); all entries not listed are zero:

(i, j, k) : value
(0,0,0) : 1
(0,1,1) : 1
(0,2,2) : 2
(0,3,3) : 6
(0,4,4) : 24
(1,1,2) : 2
(1,2,3) : 6
(1,3,4) : 24
(2,2,2) : 8
(2,2,4) : 24
(2,3,3) : 36
(2,4,4) : 192
(3,3,4) : 216
(4,4,4) : 1728


1D 4th-Order C_{ijk} Example: Legendre polynomials

C_{ijk} = \langle \psi_i \psi_j \psi_k \rangle / \langle \psi_k^2 \rangle, with \langle \psi_k^2 \rangle = 1, 1/3, 1/5, 1/7, 1/9 for k = 0, \ldots, 4.

Nonzero values of \langle \psi_i \psi_j \psi_k \rangle (symmetric under any permutation of i, j, k); all entries not listed are zero:

(i, j, k) : value
(0,0,0) : 1
(0,1,1) : 1/3
(0,2,2) : 1/5
(0,3,3) : 1/7
(0,4,4) : 1/9
(1,1,2) : 2/15
(1,2,3) : 3/35
(1,3,4) : 4/63
(2,2,2) : 2/35
(2,2,4) : 2/35
(2,3,3) : 4/105
(2,4,4) : 0.029
(3,3,4) : 0.026
(4,4,4) : 0.018


Pseudo-Spectral Construction (1)

Consider w = u^2 v, with u = \sum_{k=0}^{P} u_k \Psi_k, and similarly for v and w.

Fully spectral:

w_i = \frac{\langle u^2 v \Psi_i \rangle}{\langle \Psi_i^2 \rangle} = \sum_{j=0}^{P} \sum_{k=0}^{P} \sum_{m=0}^{P} u_j u_k v_m \frac{\langle \Psi_j \Psi_k \Psi_m \Psi_i \rangle}{\langle \Psi_i^2 \rangle}, \quad i = 0, \ldots, P

- The corresponding tensor of basis product expectations becomes too large to pre-compute and store for higher-order products
- Pseudo-spectral: project each binary PC product onto a (P + 1)-term polynomial before proceeding further, thus:


Pseudo-Spectral Construction (2)

\hat{w} = u v: \qquad \hat{w}_i = \sum_{j=0}^{P} \sum_{k=0}^{P} u_k v_j \frac{\langle \Psi_k \Psi_j \Psi_i \rangle}{\langle \Psi_i^2 \rangle}, \quad i = 0, \ldots, P

w = u \hat{w}: \qquad w_i = \sum_{j=0}^{P} \sum_{k=0}^{P} u_k \hat{w}_j \frac{\langle \Psi_k \Psi_j \Psi_i \rangle}{\langle \Psi_i^2 \rangle}, \quad i = 0, \ldots, P

- Each step reuses the same precomputed triple-product tensor
- Aliasing errors (each intermediate result is truncated to order P)
- Efficiency and convenience

[Debusschere et al., SIAM J. Sci. Comp., 2005.]
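The pseudo-spectral product itself reduces to one small routine. A hedged Python sketch (an illustration, assuming the triple-product tensor C from the earlier sketch):

import numpy as np

def pc_prod(a, b, C):
    # Pseudo-spectral (Galerkin) product: (ab)_k = sum_{i,j} a_i b_j C_ijk,
    # truncated to the same P+1 terms as the inputs.
    return np.einsum('i,j,ijk->k', a, b, C)

# w = u^2 v, built from repeated binary products (each step truncates, hence aliasing):
# w_hat = pc_prod(u, v, C); w = pc_prod(u, w_hat, C)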


Intrusive propagation through non-polynomial functions


- Addition, subtraction, and products allow (pseudo-)spectral evaluation of all polynomial functions
- How to propagate PC expansions (\{u_k\} \to \{v_k\}) through transcendental functions, e.g.

v = \frac{1}{u}, \quad v = \ln u, \quad \text{or} \quad v = e^u ?

- Use local polynomial approximations, e.g. Taylor series: e^u = 1 + \frac{u}{1!} + \frac{u^2}{2!} + \frac{u^3}{3!} + \ldots
  - Issues: convergence; high-order PC multiplications lead to aliasing; instabilities
- Write a system of equations for the output: integration approach
- Borchardt-Gauss algorithm: Arithmetic-Geometric Mean (AGM) series

[Debusschere et al., SISC 2005; McKale, Texas Tech, M.S. Thesis, 2011]


Inversion and division can be done without Taylor series

Assume three stochastic variables u, v, and w with

w = \frac{u}{v} \quad \Longleftrightarrow \quad v w = u

Mode k of the stochastic product:

(vw)_k = \sum_{m=0}^{P} \sum_{l=0}^{P} C_{klm} v_l w_m = u_k

This is a system of P + 1 linear algebraic equations in w_m with known u_k and v_l:

V_{km} = \sum_{l=0}^{P} C_{klm} v_l, \qquad V w = u

- More robust than a Taylor series expansion for 1/u
- What about the condition number of V?
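A sketch of this linear-solve approach in Python (illustrative only; it assumes the same C_ijk tensor as the earlier sketches and a well-conditioned V):

import numpy as np

def pc_div(u, v, C):
    # Solve V w = u for w = u / v, where V_km = sum_l C_klm v_l
    V = np.einsum('klm,l->km', C, v)
    return np.linalg.solve(V, u)

# np.linalg.cond(V) can be monitored to detect nearly singular denominators.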


Integration approach for non-polynomial functions

Consider the ODE

\frac{du}{dx} = u, \quad \text{with solution} \quad u = e^x

e^x can be obtained from \int du = \int u\, dx:

e^x - e^{x_0} = \int_{x_0}^{x} u\, dx

Similarly for e^{x^2} and \ln(x):

e^{x^2} - e^{x_0^2} = \int_{x_0}^{x} 2 x u\, dx, \qquad \ln(x) - \ln(x_0) = \int_{x_0}^{x} \frac{dx}{x}

- Agrees well with the directly sampled pdf if the PC order is high enough to resolve the pdf of the solution
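As an illustration of the integration approach, here is a Python sketch that builds v = e^u from the PCE of u by integrating dv = v du along a deterministic path (this is not the original implementation; it assumes the C tensor from the earlier sketches and a simple forward-Euler march):

import numpy as np

def pc_exp(u, C, nsteps=2000):
    # Integrate dv/ds = v * du/ds along u(s) = u_0 + s*(u - u_0), with v(0) = exp(u_0).
    du = u.copy(); du[0] = 0.0                   # fluctuating part of u
    v = np.zeros_like(u); v[0] = np.exp(u[0])    # deterministic starting point
    ds = 1.0 / nsteps
    for _ in range(nsteps):
        v = v + np.einsum('i,j,ijk->k', v, du, C) * ds   # pseudo-spectral product at each step
    return v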


A More General Integration approach for irrational functions

To evaluate u(x), with x = \sum_{k=0}^{P} x_k \Psi_k and u = \sum_{k=0}^{P} u_k \Psi_k:
- use a deterministic initial condition x_a such that u(x_a) is known
- express \hat{u} = du/dx = f(u, x); this requires f to be a rational function, which ensures that the coefficients \hat{u}_k can be found from the u_k and x_k coefficients
- evaluate the integral

u_k(x_b) - u_k(x_a) = \sum_{i=0}^{P} \sum_{j=0}^{P} C_{ijk} \int \hat{u}_i\, dx_j

- Works for e^x, e^{x^2}, and \ln(x), with \hat{u} = u, 2xu, and 1/x respectively, but not for e^{\sin x}, where \hat{u} = u \cos x
- CPU-intensive, and only slightly more robust than a Taylor series


Pseudo-spectral overloading of operations

- The construction allows for a general representation using pseudo-spectral (PS) overloaded operations
- E.g. the multiplication operation w = u \cdot u \cdot v: each deterministic multiplication is transformed into a corresponding polynomial chaos product
- Potential meta-code: take a general deterministic code function F(u) and produce a pseudo-spectral stochastic function \hat{F}(u)
- Possibility of transforming legacy deterministic code into corresponding pseudo-spectral stochastic code
- UQToolkit: contains a library of utilities for operations on random variables represented with PCEs


Uncertainty Quantification Toolkit (UQTk)

- A library of C++ and Matlab functions for propagation of uncertainty through computational models
- Mainly relies on spectral Polynomial Chaos Expansions (PCEs) for representing random variables and stochastic processes
- Target usage: rapid prototyping, algorithmic research, tutorials / education
- Version 1.0 released under the GNU Lesser General Public License
- Downloadable from http://www.sandia.gov/UQToolkit/


UQTk contents and development/release plans

Currently released (http://www.sandia.gov/UQToolkit/):
- C++ tools for intrusive UQ with PCEs

Under production, planned release Fall 2012:
- C++ tools for non-intrusive UQ
- Matlab tools for intrusive and non-intrusive UQ
- Karhunen-Loève decomposition
- Bayesian inference tools
- Many more examples and documentation

Under development:
- Support for multiwavelet-based stochastic domain decompositions
- Support for arbitrary basis types


PCEs in the UQToolkit

// Initialize PC class for intrusive spectral projection (ISP)
int ord = 5;                          // Order of PCE
int dim = 1;                          // Number of uncertain parameters
PCSet myPCSet("ISP",ord,dim,"LU");    // Legendre-Uniform PCEs

// Initialize PC class for non-intrusive spectral projection (NISP)
int ord = 5;                          // Order of PCE
int dim = 1;                          // Number of uncertain parameters
PCSet myPCSet("NISP",ord,dim,"LU");   // Legendre-Uniform PCEs

- Currently supported: Wiener-Hermite, Legendre-Uniform, Gamma-Laguerre (limited), and Jacobi-Beta (development version)
- The PCSet class initializes the PC basis type and pre-computes information needed for working with PC expansions


Operations on PCEs in the UQToolkit

// PC coefficients in double* arrays
double* a = new double[npc];
double* b = new double[npc];
double* c = new double[npc];
// Initialization
a[0] = 2.0; a[1] = 0.1; ...
// Perform some arithmetic
myPCSet.Subtract(a,b,c);
myPCSet.Prod(a,b,c);
myPCSet.Exp(a,c);
myPCSet.Log(a,c);

// PC coefficients in Array1D objects
Array1D<double> aa(npc,0.e0);
Array1D<double> ab(npc,0.e0);
Array1D<double> ac(npc,0.e0);
// Initialization
aa(0) = 2.0; aa(1) = 0.1; ...
// Perform arithmetic
myPCSet.Subtract(aa,ab,ac);
myPCSet.Prod(aa,ab,ac);
myPCSet.Exp(aa,ac);
myPCSet.Log(aa,ac);

- PC coefficients are stored either in double* vectors or in the more advanced custom Array1D<double> class
- Functions can take either data type as argument


Surface Reaction Model


3 ODEs for a monomer (u), dimer (v), and inert species (w) adsorbing onto a surface out of the gas phase:

\frac{du}{dt} = a z - c u - 4 d u v
\frac{dv}{dt} = 2 b z^2 - 4 d u v
\frac{dw}{dt} = e z - f w
z = 1 - u - v - w
u(0) = v(0) = w(0) = 0

Parameters: a = 1.6, b = 20.75, c = 0.04, d = 1.0, e = 0.36, f = 0.016

Oscillatory behavior for b \in [20.2, 21.2]

[Figure: species mass fractions vs. time]

(Vigil et al., Phys. Rev. E, 1996; Makeev et al., J. Chem. Phys., 2002)


Surface Reaction Model: Intrusive Spectral Propagation (ISP) of Uncertainty


- Assume a PCE for the uncertain parameter b and for the output variables u, v, w
- Substitute the PCEs into the governing equations
- Project the governing equations onto the PC basis functions: multiply with \Psi_k and take the expectation
- Apply pseudo-spectral approximations where necessary
- UQTk provides the elementary operations


Surface Reaction Model: Specify PCEs for inputs and outputs


Represent the uncertain input with a PCE with known coefficients:

b = \sum_{i=0}^{P} b_i \Psi_i(\xi)

Represent all uncertain variables with PCEs with unknown coefficients:

u = \sum_{i=0}^{P} u_i \Psi_i(\xi), \quad v = \sum_{i=0}^{P} v_i \Psi_i(\xi), \quad w = \sum_{i=0}^{P} w_i \Psi_i(\xi), \quad z = \sum_{i=0}^{P} z_i \Psi_i(\xi)


Surface Reaction Model: Substitute PCEs into governing equations and project onto basis functions

\frac{du}{dt} = a z - c u - 4 d u v

\frac{d}{dt} \sum_{i=0}^{P} u_i \Psi_i = a \sum_{i=0}^{P} z_i \Psi_i - c \sum_{i=0}^{P} u_i \Psi_i - 4 d \left( \sum_{i=0}^{P} u_i \Psi_i \right) \left( \sum_{j=0}^{P} v_j \Psi_j \right)

\left\langle \frac{d}{dt} \sum_{i=0}^{P} u_i \Psi_i\, \Psi_k \right\rangle = a \left\langle \sum_{i=0}^{P} z_i \Psi_i\, \Psi_k \right\rangle - c \left\langle \sum_{i=0}^{P} u_i \Psi_i\, \Psi_k \right\rangle - 4 d \left\langle \sum_{i=0}^{P} \sum_{j=0}^{P} u_i v_j \Psi_i \Psi_j \Psi_k \right\rangle


Surface Reaction Model: Reorganize terms

\frac{du_k}{dt} \langle \Psi_k^2 \rangle = a z_k \langle \Psi_k^2 \rangle - c u_k \langle \Psi_k^2 \rangle - 4 d \sum_{i=0}^{P} \sum_{j=0}^{P} u_i v_j \langle \Psi_i \Psi_j \Psi_k \rangle

\frac{du_k}{dt} = a z_k - c u_k - 4 d \sum_{i=0}^{P} \sum_{j=0}^{P} u_i v_j \frac{\langle \Psi_i \Psi_j \Psi_k \rangle}{\langle \Psi_k^2 \rangle}

\frac{du_k}{dt} = a z_k - c u_k - 4 d \sum_{i=0}^{P} \sum_{j=0}^{P} u_i v_j C_{ijk}

The triple products C_{ijk} = \langle \Psi_i \Psi_j \Psi_k \rangle / \langle \Psi_k^2 \rangle can be pre-computed and stored for repeated use.


Surface Reaction Model: Substitute PCEs into governing equations and project onto basis functions

\frac{dv}{dt} = 2 b z^2 - 4 d u v

\frac{d}{dt} \sum_{i=0}^{P} v_i \Psi_i = 2 \left( \sum_{h=0}^{P} b_h \Psi_h \right) \left( \sum_{i=0}^{P} z_i \Psi_i \right) \left( \sum_{j=0}^{P} z_j \Psi_j \right) - 4 d \left( \sum_{i=0}^{P} u_i \Psi_i \right) \left( \sum_{j=0}^{P} v_j \Psi_j \right)

\left\langle \frac{d}{dt} \sum_{i=0}^{P} v_i \Psi_i\, \Psi_k \right\rangle = 2 \left\langle \sum_{h=0}^{P} \sum_{i=0}^{P} \sum_{j=0}^{P} b_h z_i z_j \Psi_h \Psi_i \Psi_j \Psi_k \right\rangle - 4 d \left\langle \sum_{i=0}^{P} \sum_{j=0}^{P} u_i v_j \Psi_i \Psi_j \Psi_k \right\rangle


Surface Reaction Model: Reorganize terms

\frac{dv_k}{dt} \langle \Psi_k^2 \rangle = 2 \sum_{h=0}^{P} \sum_{i=0}^{P} \sum_{j=0}^{P} b_h z_i z_j \langle \Psi_h \Psi_i \Psi_j \Psi_k \rangle - 4 d \sum_{i=0}^{P} \sum_{j=0}^{P} u_i v_j \langle \Psi_i \Psi_j \Psi_k \rangle

\frac{dv_k}{dt} = 2 \sum_{h=0}^{P} \sum_{i=0}^{P} \sum_{j=0}^{P} b_h z_i z_j \frac{\langle \Psi_h \Psi_i \Psi_j \Psi_k \rangle}{\langle \Psi_k^2 \rangle} - 4 d \sum_{i=0}^{P} \sum_{j=0}^{P} u_i v_j \frac{\langle \Psi_i \Psi_j \Psi_k \rangle}{\langle \Psi_k^2 \rangle}

\frac{dv_k}{dt} = 2 \sum_{h=0}^{P} \sum_{i=0}^{P} \sum_{j=0}^{P} b_h z_i z_j D_{hijk} - 4 d \sum_{i=0}^{P} \sum_{j=0}^{P} u_i v_j C_{ijk}

- Pre-computing and storing the quad product D_{hijk} becomes cumbersome
- Use the pseudo-spectral approach instead


Surface Reaction Model: Pseudo-Spectral approach for products


Introduce an auxiliary variable g = z^2:

g_k = \sum_{i=0}^{P} \sum_{j=0}^{P} z_i z_j C_{ijk}

f = 2 b z^2 = 2 b g: \qquad f_k = 2 \sum_{i=0}^{P} \sum_{j=0}^{P} b_i g_j C_{ijk}

- Limits the complexity of computing product terms
- Higher products can be computed by repeated use of the same binary product rule
- Does introduce errors if the order of the PCE is not large enough


Surface Reaction Model: UQTk implementation

// Build du/dt = a*z - c*u - 4.0*d*u*v
aPCSet.Multiply(z,a,dummy1);             // dummy1 = a*z
aPCSet.Multiply(u,c,dummy2);             // dummy2 = c*u
aPCSet.SubtractInPlace(dummy1,dummy2);   // dummy1 = a*z - c*u
aPCSet.Prod(u,v,dummy2);                 // dummy2 = u*v
aPCSet.MultiplyInPlace(dummy2,4.e0*d);   // dummy2 = 4.0*d*u*v
aPCSet.Subtract(dummy1,dummy2,dudt);     // dudt   = a*z - c*u - 4.0*d*u*v

- All operations are replaced with their equivalent intrusive UQ counterparts
- Results in a set of coupled ODEs for the PC coefficients; u, v, w, z represent vectors of PC coefficients
- This set of equations is integrated to get the evolution of the PC coefficients in time


Surface Reaction Model: Second equation implementation

// Build dv/dt = 2.0*b*z*z - 4.0*d*u*v
aPCSet.Prod(z,z,dummy1);                 // dummy1 = z*z
aPCSet.Prod(dummy1,b,dummy2);            // dummy2 = b*z*z
aPCSet.Multiply(dummy2,2.e0,dummy1);     // dummy1 = 2.0*b*z*z
aPCSet.Prod(u,v,dummy2);                 // dummy2 = u*v
aPCSet.MultiplyInPlace(dummy2,4.e0*d);   // dummy2 = 4.0*d*u*v
aPCSet.Subtract(dummy1,dummy2,dvdt);     // dvdt   = 2.0*b*z*z - 4.0*d*u*v

// Build dw/dt = e*z - f*w
aPCSet.Multiply(z,e,dummy1);             // dummy1 = e*z
aPCSet.Multiply(w,f,dummy2);             // dummy2 = f*w
aPCSet.Subtract(dummy1,dummy2,dwdt);     // dwdt   = e*z - f*w

- Dummy variables are used where needed to build the terms in the equations
- The data structure is currently being enhanced to provide the operation result as the function return value
- This will allow more elegant inline replacement of operators with their stochastic counterparts


Surface Reaction Model: ISP results

[Figure: mean and standard deviation of the species mass fractions u, v, w vs. time]

- Assume 0.5% uncertainty in b around its nominal value
- Legendre-Uniform intrusive PC
- Mean and standard deviation for u, v, and w
- Uncertainty grows in time
Surface Reaction Model: ISP results

[Figure: PC modes u_0 through u_5 of u vs. time]

- Modes of u
- Modes decay with higher order
- Amplitudes of the oscillations of higher-order modes grow in time
Surface Reaction Model: ISP results: PDFs

[Figure: PDFs of u at the times of maximum mean (t = 330.5 and t = 756.5, left) and maximum standard deviation (t = 377.8 and t = 803.0, right)]

- PDFs of u at maximum mean (left) and maximum standard deviation (right)
- Distributions get broader and multimodal as time increases
- Effect of accumulating uncertainty in the phase of the oscillation
Spectral UQ: Incompressible Flow - Stochastic Projection Method

(P + 1) Galerkin-projected momentum/continuity equations, q = 0, \ldots, P:

\frac{\partial v_q}{\partial t} + (v \cdot \nabla v)_q = -\nabla p_q + \frac{1}{Re} \left\{ \nabla \cdot [\nabla v + (\nabla v)^T] \right\}_q
\nabla \cdot v_q = 0

Projection scheme, for q = 0, \ldots, P:

\frac{v_q^* - v_q^n}{\Delta t} = C_q^n + D_q^n
\nabla^2 p_q = \frac{1}{\Delta t} \nabla \cdot v_q^*
\frac{v_q^{n+1} - v_q^*}{\Delta t} = -\nabla p_q

P + 1 decoupled Poisson equations for the pressure modes.

[Le Maître et al., J. Comp. Phys., 2001.]
Laminar 2D Channel Flow with Uncertain Viscosity

- Incompressible flow
- Gaussian viscosity PDF: \nu = \nu_0 + \nu_1 \xi
- Streamwise velocity: v = \sum_{i=0}^{P} v_i \Psi_i
  - v_0: mean; v_i: i-th order mode
  - \sigma^2 = \sum_{i=1}^{P} v_i^2 \langle \Psi_i^2 \rangle

[Figure: contours of the velocity modes v_0 through v_3 and of the standard deviation]
Non-Intrusive propagation of uncertainty - Projection


Galerkin projection:

u_k = \frac{\langle u \Psi_k \rangle}{\langle \Psi_k^2 \rangle} = \frac{1}{\langle \Psi_k^2 \rangle} \int u\, \Psi_k(\xi)\, w(\xi)\, d\xi, \quad k = 0, \ldots, P

Evaluate the projection integrals numerically:
- Pick samples of the uncertain parameters, e.g. b(\xi), by sampling the germ \xi
- Run the deterministic forward model for each of the sampled input parameter values b_i = b(\xi_i)
- The integration depends on the sampling approach:
  - Random sampling: \langle u \Psi_k \rangle \approx \frac{1}{N_s} \sum_{i=1}^{N_s} u(b_i) \Psi_k(\xi_i)
  - Quadrature: \langle u \Psi_k \rangle \approx \sum_{i=1}^{N_q} q_i u(b_i) \Psi_k(\xi_i)
- Reconstruct the uncertain model output:

u(x, t; \xi) = \sum_{k=0}^{P} u_k(x, t) \Psi_k(\xi(\omega))


Random Sampling approaches

Evaluate the integral through sampling:

\int u\, \Psi_k(\xi)\, w(\xi)\, d\xi \approx \frac{1}{N_s} \sum_{i=1}^{N_s} u(\xi_i) \Psi_k(\xi_i)

Samples are drawn according to the distribution of \xi:
- Monte Carlo (MC)
- Latin Hypercube Sampling (LHS)

Pros:
- Can easily be made fault tolerant
- Sometimes random samples are all we have

Cons: slow convergence, but only weakly dependent on the number of stochastic dimensions


Quadrature approaches
Numerically evaluate the integrals in the Galerkin projection:

\int u\, \Psi_k(\xi)\, w(\xi)\, d\xi \approx \sum_{i=1}^{N_q} q_i u(\xi_i) \Psi_k(\xi_i)

- Gauss quadrature rules are very efficient: the \xi_i are quadrature points, with corresponding weights q_i
- N_q quadrature points can integrate a polynomial of order 2 N_q - 1 exactly
- Gauss-Hermite and Gauss-Legendre quadrature are tailored to specific choices of the weight function w(\xi)
- As a rule of thumb, p + 1 quadrature points are needed for the Galerkin projection of a PCE of order p:
  - If both u and \Psi_k are of order p, the integrand is of order 2p; requiring 2p \le 2 N_q - 1 gives N_q \ge p + 1/2, i.e. N_q = p + 1
  - Only exact if u is indeed a polynomial of order p

Pros:
- Can use existing codes as a black box to evaluate u(\xi_i)
- Embarrassingly parallel

Cons: a tensor-product rule for d dimensions requires N_q^d samples
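A compact Python sketch of non-intrusive spectral projection by quadrature (an illustration, not UQTk; it assumes a 1D Gaussian germ with a probabilists'-Hermite basis, a black-box model evaluated at each node, and factorial norms):

import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

def nisp_1d(model, p):
    # Project model(xi) onto a 1D Hermite PCE of order p with a (p+1)-point Gauss rule.
    xi, w = He.hermegauss(p + 1)
    w = w / np.sqrt(2.0 * np.pi)                     # standard-normal weights
    runs = np.array([model(x) for x in xi])          # deterministic model evaluations
    uk = np.empty(p + 1)
    for k in range(p + 1):
        psi_k = He.hermeval(xi, np.eye(p + 1)[k])
        uk[k] = np.sum(w * runs * psi_k) / factorial(k)   # <u psi_k> / <psi_k^2>
    return uk

# Toy usage: propagate a hypothetical parameter b = 20.75*(1 + 0.005*xi) through u = b**2
print(nisp_1d(lambda x: (20.75 * (1 + 0.005 * x))**2, p=3))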


Non-Intrusive propagation of uncertainty - Collocation

Collocation techniques minimize errors at the sample points:

\sum_{k=0}^{P} u_k \Psi_k(\xi_i) = u(\xi_i), \quad i = 1, \ldots, N_c

- Can use interpolation, e.g. Lagrange interpolants
- Or use regression approaches: P + 1 degrees of freedom to fit N_c points

Pros: can position points where the most accuracy is desired
Cons:



Surface Reaction Model: NISP implementation in UQTk

Quadrature:

// Get the quadrature points
int nQdpts=myPCSet.GetNQuadPoints();
double* qdpts=new double[nQdpts];
myPCSet.GetQuadPoints(qdpts);
...
// Evaluate parameter at quad pts
for(int i=0;i<nQdpts;i++){
  bval[i]=myPCSet.EvalPC(b,&qdpts[i]);
}
...
// Run model for all samples
for(int i=0;i<nQdpts;i++){
  u_val[i] = ...
}
// Spectral projection
myPCSet.GalerkProjection(u_val,u);
myPCSet.GalerkProjection(v_val,v);
myPCSet.GalerkProjection(w_val,w);

Monte-Carlo Sampling:

// Get the sample points
int nSamples=1000;
Array2D<double> samPts(nSamples,dim);
myPCSet.DrawSampleVar(samPts);
...
// Evaluate parameter at sample pts
for(int i=0;i<nSamples;i++){
  ... // select samPt from samPts
  bval[i]=myPCSet.EvalPC(b,&samPt);
}
...
// Run model for all samples
for(int i=0;i<nSamples;i++){
  u_val[i] = ...
}
// Spectral projection
myPCSet.GalerkProjectionMC(samPts,u_val,u);
myPCSet.GalerkProjectionMC(samPts,v_val,v);
myPCSet.GalerkProjectionMC(samPts,w_val,w);


Surface Reaction Model: NISP results

[Figure: mean and standard deviation of u, v, w vs. time for NISP with quadrature (left) and NISP with Monte Carlo sampling (right)]

- Mean and standard deviation for u, v, and w
- The quadrature approach agrees well with the ISP approach using 6 quadrature points
- The Monte Carlo sampling approach converges slowly
- With 1000 samples, the results are still quite different from ISP and NISP quadrature
Surface Reaction Model: Comparison ISP and NISP

[Figure: higher-order modes u_4 and u_5 of u vs. time for ISP and NISP]

- Lower-order modes agree perfectly
- Very small differences in the higher-order modes
- The difference increases with time
Surface Reaction Model: Comparison ISP and NISP

[Figure: PDF of u at t = 803.0 for ISP and NISP]

- All pdfs are based on 50K samples each and evaluated with Kernel Density Estimation (KDE)
- No difference in the PDFs of the sampled PCEs between NISP and ISP
Surface Reaction Model: Comparison ISP, NISP, and MC

[Figure: PDF of u at t = 803.0 for ISP, NISP, and direct Monte Carlo]

- All pdfs are based on 50K samples each and evaluated with Kernel Density Estimation (KDE)
- Good agreement between intrusive projection, non-intrusive projection, and Monte Carlo sampling
ISP pros and cons

Pros:
- Elegant
- A one-time solution of the system of equations for the PC coefficients fully characterizes the uncertainty in all variables at all times
- Tailored solvers can (potentially) take advantage of new hardware developments

Cons:
- Often requires a rewrite of the original code
- The reformulated system is a factor (P+1) larger than the original system and can be challenging to solve
- Challenges with increasing time horizon for ODEs

Many efforts in the community to automate ISP:
- UQToolkit: http://www.sandia.gov/UQToolkit/
- Sundance: http://www.math.ttu.edu/~klong/Sundance/html/
- Stokhos: http://trilinos.sandia.gov/packages/stokhos/
- ...


NISP pros and cons


Pros:
- Easy to use as wrappers around existing codes
- Embarrassingly parallel

Cons:
- Most methods suffer from the curse of dimensionality, N_q = n^d

Many development efforts for smarter sampling approaches and dimensionality reduction:
- (Adaptive) sparse quadrature approaches
- Compressive sensing
- ...

Sampling methods have found very widespread use in the community:
- DAKOTA: http://dakota.sandia.gov/
- ...


Sensitivity Analysis

- Obtaining global sensitivity analysis from PCEs
- Identify dominant sources of uncertainty
- Attribution


PC postprocessing: global sensitivity information is readily obtained from PCE


g(x_1, \ldots, x_d) = \sum_{k=0}^{K-1} c_k \Psi_k(x)

Global sensitivity analysis via variance decomposition.

Total variance:

Var[g(x)] = \sum_{k>0} c_k^2 \|\Psi_k\|^2

Main-effect sensitivity indices:

S_i = \frac{Var[E(g(x)|x_i)]}{Var[g(x)]} = \frac{\sum_{k \in I_i} c_k^2 \|\Psi_k\|^2}{\sum_{k>0} c_k^2 \|\Psi_k\|^2}

where I_i is the set of basis terms involving only x_i. S_i is the uncertainty contribution that is due to the i-th parameter only.

Joint sensitivity indices:

S_{ij} = \frac{Var[E(g(x)|x_i, x_j)]}{Var[g(x)]} - S_i - S_j = \frac{\sum_{k \in I_{ij}} c_k^2 \|\Psi_k\|^2}{\sum_{k>0} c_k^2 \|\Psi_k\|^2}

where I_{ij} is the set of basis terms involving only x_i and x_j. S_{ij} is the uncertainty contribution that is due to the (i, j) parameter pair.
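A small Python sketch of the main-effect computation (illustrative only; it assumes the PC coefficients c_k, the basis norms \|\Psi_k\|^2, and the multi-index array of the expansion are available as arrays):

import numpy as np

def main_effect_indices(c, norms2, mindex):
    # c[k]: PC coefficients; norms2[k] = ||Psi_k||^2; mindex[k, i]: order of dimension i in term k
    c, norms2, mindex = np.asarray(c), np.asarray(norms2), np.asarray(mindex)
    var_terms = c[1:]**2 * norms2[1:]             # per-term variance contributions (k >= 1)
    total_var = var_terms.sum()
    d = mindex.shape[1]
    S = np.zeros(d)
    for i in range(d):
        only_i = (mindex[1:, i] > 0) & (mindex[1:, np.arange(d) != i] == 0).all(axis=1)
        S[i] = var_terms[only_i].sum() / total_var
    return S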

PC postprocessing: sampling-based approaches


g(x_1, \ldots, x_d) = \sum_{k=0}^{K-1} c_k \Psi_k(x)

In some cases one needs to resort to Monte Carlo estimation, e.g.:
- Piecewise PC with irregular subdomains
- Output transformations, e.g. a PC is built for \log g(x), but the sensitivity is needed with respect to g(x)

A brute-force sampling of Var[E(g(x)|x_i)] is extremely inefficient. Tricks are available, given a single set of sampled inputs [Saltelli, 2002]. E.g., use

E[E(g(x)|x_i)^2] = E[g(x|x_i)\, g(\tilde{x}|x_i)] \approx \frac{1}{N-1} \sum_{r=1}^{N} g(x^{(r)})\, g(\tilde{x}^{(r)})

where \tilde{x}^{(r)} is a second, independently sampled point whose i-th element is replaced by x_i^{(r)}, i.e. it shares only x_i with x^{(r)}.
- Similar formulas are available for joint sensitivity indices
- Con: as with all Monte Carlo algorithms, it converges slowly
- Pro: sampling is cheap

Challenges for PCE-based Uncertainty Quantication

- Representing input variables with arbitrary distributions
- Systems with high-dimensional uncertainty
- Systems with a long time horizon / oscillatory behavior
- Nonlinearities in the governing equations for intrusive UQ
- Physical constraints on uncertain quantities
- Systems with non-smooth behavior / discontinuities
- Systems with inherent stochasticity

Various approaches have been developed to tackle these challenges ...


Obtaining PCEs for uncertain inputs

- Characterizing PCEs for uncertain inputs is a really difficult problem
- Inputs are specified in a variety of ways:
  - Probability density function
  - Samples
  - Expert opinion (e.g. "about 3.5")
- Often obtained from an inverse problem solution
  - See the session on Bayesian inference
  - Generally provides many samples of the uncertain inputs


Inverse CDF Mapping for 1D RVs

[Figure: posterior PDF of parameter b and its PC representation]

Consider a random variable a with CDF F(\cdot):
- Either specified or constructed from samples with KDE
- The CDF transformation F(a) = \eta maps the random variable a to a uniform[0, 1] random variable \eta
- \xi = \Phi^{-1}(\eta) maps the uniform RV to a standard normal RV

The inverse CDF enables NISP projection:

a = \sum_{k=0}^{P} a_k \psi_k(\xi), \qquad a_k \langle \psi_k^2 \rangle = \int F_a^{-1}(\Phi(\xi))\, \psi_k(\xi)\, w(\xi)\, d\xi
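A Python sketch of this 1D construction from samples (illustrative only; it assumes an empirical inverse CDF built by interpolating sorted samples, a standard-normal germ, and Gauss-Hermite quadrature for the projection):

import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He
from scipy.stats import norm

def pce_from_samples(samples, p):
    xi, w = He.hermegauss(2 * (p + 1))
    w = w / np.sqrt(2.0 * np.pi)                               # standard-normal weights
    srt = np.sort(samples)
    cdf_levels = (np.arange(srt.size) + 0.5) / srt.size        # empirical CDF levels
    a_at_xi = np.interp(norm.cdf(xi), cdf_levels, srt)         # a = F^{-1}(Phi(xi)) at the nodes
    return np.array([np.sum(w * a_at_xi * He.hermeval(xi, np.eye(p + 1)[k])) / factorial(k)
                     for k in range(p + 1)])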


Constructing an nD PCE for a RV with a given PDF

Given a RV z \in R with PDF g(z), define:

z = \sum_{i=0}^{P} z_i \Psi_i(\xi_1, \xi_2, \ldots, \xi_n), \qquad P + 1 = \frac{(n+p)!}{n!\,p!}

- No general procedure
- Can choose {n, p} and the mode strengths by ensuring accurate capture of:
  - the PDF g(z)
  - select moments of z
  - some observable of interest \phi(z)


Multivariate Normal Approximation

- Many distributions are unimodal and somewhat shaped like Gaussians
- MultiVariate Normal (MVN) approximations capture the mean and correlation structure of the random variables
- Easy to extract from a set of samples
- In 1D: just compute the mean and standard deviation: u = u_0 + u_1 \xi
- Multi-D: Cholesky factorization of the covariance: C = L L^T, u = \mu + L \xi

# Compute mean parameter values
par_mean = numpy.mean(samples,axis=0)
# Compute the covariance
par_cov = numpy.cov(samples,rowvar=0)
# Compute the Cholesky decomposition
chol_lower = numpy.linalg.cholesky(par_cov)
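As a short follow-up sketch (an assumption, not from the slides): with a first-order Gauss-Hermite PCE per dimension, the Cholesky factor directly supplies the PC coefficients, continuing from the snippet above:

import numpy as np

# The first-order coefficient of u_i with respect to germ xi_j is simply chol_lower[i, j].
xi = np.random.randn(10000, par_mean.size)      # i.i.d. standard-normal germ samples
u_samples = par_mean + xi @ chol_lower.T        # samples of the MVN / first-order PCE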


MVN approximation of Bayesian posterior from MCMC samples

Comparison of the posterior (blue) with its MVN approximation (red)

[Figure: joint posterior density of parameters S1 and CS, with MVN samples overlaid]

First-order PCE corresponding to the MVN approximation:

S1 = 1.351 + 0.01367 \xi_1
CS = 5310 - 26.25 \xi_1 + 20.26 \xi_2
Rosenblatt for Multi-D RVs


- Assume samples of multi-D RVs are available (e.g. from MCMC sampling of a posterior parameter distribution obtained with Bayesian inference)
- The Rosenblatt transformation maps any (not necessarily independent) set of random variables (\xi_1, \ldots, \xi_n) to uniform i.i.d. variables \{\eta_i\}_{i=1}^{n} (Rosenblatt, 1952):

\eta_1 = F_1(\xi_1)
\eta_2 = F_{2|1}(\xi_2 | \xi_1)
\vdots
\eta_n = F_{n|n-1,\ldots,1}(\xi_n | \xi_{n-1}, \ldots, \xi_1)

- The Rosenblatt transformation is a multi-D generalization of the 1D CDF mapping
- Conditional CDFs are harder to evaluate in high dimensions


Projection of Rosenblatt transformed vars onto PCEs

[Figure: samples of the correlated parameters (a, b) (left) and the corresponding mapped uniform variables (\eta_1, \eta_2) (right)]

NISP projection is enabled by the inverse Rosenblatt transformation (a, b) = R^{-1}(\eta_1, \eta_2), which ensures a well-defined quadrature integration:

a = \sum_{k=0}^{P} a_k \Psi_k(\xi), \qquad a_k \langle \Psi_k^2 \rangle = \int R_a^{-1}(\eta(\xi))\, \Psi_k(\xi)\, w(\xi)\, d\xi
b = \sum_{k=0}^{P} b_k \Psi_k(\xi), \qquad b_k \langle \Psi_k^2 \rangle = \int R_b^{-1}(\eta(\xi))\, \Psi_k(\xi)\, w(\xi)\, d\xi
Karhunen-Loève (KL) Expansions

Assume a stochastic process F(x, \omega): an L^2 random field on a domain D, with covariance function Cov(x, y). F can be written as

F(x, \omega) = \langle F(x, \omega) \rangle + \sum_{k=1}^{\infty} \sqrt{\lambda_k}\, f_k(x)\, \xi_k(\omega)

- f_k(x): eigenfunctions of Cov(x, y)
- \lambda_k: corresponding eigenvalues, all positive
- \xi_k: uncorrelated random variables with unit variance
  - Samples are obtained by projecting realizations of F onto the f_k
  - Generally not independent
  - Special case: for Gaussian F, the \xi_k are i.i.d. normal random variables
- The KLE is optimal: of all possible orthonormal bases for L^2(D), the \{f_k(x)\} minimize the mean-square error in a finite linear representation of F(\cdot)


KL Expansions - Numerical Approach - 1


The covariance matrix, Cov(x, y) = \langle F(x, \omega) F(y, \omega) \rangle, is either:
- specified analytically, or
- estimated from samples

Estimate eigenvalues and eigenfunctions from the Fredholm equation of the second kind:

\int_D Cov(x, y)\, f(y)\, dy = \lambda f(x)

... using the Nyström algorithm:

\sum_{i=1}^{N_p} w_i\, Cov(x, y_i)\, f(y_i) = \lambda f(x)

where the w_i are the weights of the quadrature rule that uses the N_p points y_i where realizations are provided. Further manipulation leads to the eigenvalue problem A g = \lambda g, where A = W K W and g = W f, with W the diagonal matrix W_{ii} = \sqrt{w_i} and K_{ij} = Cov(x_i, y_j). Solutions consist of pairs of eigenvalues \lambda_k and eigenmodes f_k = W^{-1} g_k.
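A minimal Python sketch of the Nyström construction on a 1D grid (an illustration under stated assumptions: a uniform grid with trapezoidal quadrature weights and the squared-exponential covariance used in the examples below):

import numpy as np

x = np.linspace(0.0, 1.0, 200)
wq = np.full_like(x, x[1] - x[0]); wq[[0, -1]] *= 0.5          # trapezoid quadrature weights
ell = 0.2                                                      # assumed correlation length
K = np.exp(-(x[:, None] - x[None, :])**2 / ell**2)             # K_ij = Cov(x_i, x_j)

W = np.diag(np.sqrt(wq))
lam, g = np.linalg.eigh(W @ K @ W)                             # A g = lambda g, with A = W K W
lam, g = lam[::-1], g[:, ::-1]                                 # sort eigenvalues descending
modes = np.linalg.solve(W, g)                                  # eigenmodes f_k = W^{-1} g_k

def kl_xi(F, n_keep=10):
    # Project realizations F (n_samples x n_points) onto the first n_keep modes
    Fp = F - F.mean(axis=0)
    return (Fp * wq) @ modes[:, :n_keep] / np.sqrt(lam[:n_keep])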


KL Expansions - Numerical Approach - 2


Samples of the random variables \xi_k are obtained by projecting realizations of the random process F onto the eigenmodes f_k:

\xi_k |_l = \langle F(x, \omega_l) - \langle F(x, \omega) \rangle,\, f_k(x) \rangle / \sqrt{\lambda_k}

... or numerically:

\xi_k |_l = \sum_{i=1}^{N_p} w_i \left[ F(x_i, \omega_l) - \langle F(x_i, \omega) \rangle \right] f_k(x_i) / \sqrt{\lambda_k}

- For a Gaussian process: we automatically have a first-order Wiener-Hermite PCE
- If not, the same approaches used for converting RVs to PCEs can be applied to the KL RVs


1D Gaussian Process: Realizations

[Figure: sample realizations f(x) of the Gaussian process for correlation lengths 0.1 (left) and 0.2 (right)]

- Covariance Cov(x_1, x_2) = \exp(-(x_1 - x_2)^2 / \ell^2)
- Sample realizations are noisier as the correlation length decreases
1D Gaussian Process: KL modes

[Figure: the first four KL eigenmodes f_1 through f_4 for correlation lengths 0.1 (left) and 0.2 (right)]

- Eigenmodes of the covariance matrix
- The data covariance matrix is constructed from 4096 Gaussian process realizations
- Higher modes are more oscillatory
1D Gaussian Process: KL random variables

[Figure: PDFs of the KL random variables for correlation lengths 0.1 (left) and 0.2 (right), compared with the standard normal density]

- Random variables obtained by projecting realizations onto the KL modes
- Uncorrelated by construction
- Also independent due to the nature of the Gaussian process
1D Gaussian Process: Eigenvalue spectrum

[Figure: KL eigenvalue spectra for correlation lengths 0.05, 0.1, 0.2, and 0.5]

- The eigenvalue spectrum decays more slowly as the correlation length decreases
- More oscillatory modes are needed to represent fluctuations in x
- The KL expansion is generally truncated after enough modes are included to capture a specified fraction of the total variance
1D Gaussian Process: Reconstructed realizations

[Figure: KL eigenvalue spectra (left) and a sample realization reconstructed with an increasing number of KL terms (right); the original slides step through reconstructions with 1 up to 18 terms]

- Large-scale features can be resolved with a small number of modes
- Smaller-scale features require higher modes

KL of 2D Gaussian Process

[Figure: realizations of the 2D Gaussian process for correlation lengths 0.1, 0.2, and 0.5]

- 2D Gaussian process with covariance Cov(x_1, x_2) = \exp(-\|x_1 - x_2\|^2 / \ell^2)
- Realizations are smoother as the covariance length increases
2D KL - Modes for correlation length 0.1

[Figure: the first eight 2D KL eigenmodes f_1 through f_8]
2D KL - Modes for correlation length 0.2

[Figure: the first eight 2D KL eigenmodes f_1 through f_8]
2D KL - Modes for correlation length 0.5

[Figure: the first eight 2D KL eigenmodes f_1 through f_8]
2D KL - eigenvalue spectrum

[Figure: 2D KL eigenvalue spectra for correlation lengths 0.1, 0.2, and 0.5, and a realization with correlation length 0.1 reconstructed with 4, 16, 32, and 64 terms]
2D KL - eigenvalue spectrum

[Figure: the same eigenvalue spectra, with a realization of correlation length 0.2 reconstructed with 4, 16, 32, and 64 terms]
2D KL - eigenvalue spectrum

[Figure: the same eigenvalue spectra, with a realization of correlation length 0.5 reconstructed with 4, 16, 32, and 64 terms]
Other approaches

- Domain decomposition approaches with projection
- Inference-based approaches for PCEs


Sparse Quadrature Approaches for High-Dimensional Systems

- The need for sparse quadrature
- Sparse quadrature grids
- Application to the surface reaction example


Sparse quadrature drastically reduces the number of function evaluations


Dene the precision as the highest order of a polynomial that is integrated exactly by the quadrature rule. Gaussian quadratures are optimal in 1d N points achieve the highest possible precision of 2N 1. In multi-d, full product quadrature is wasteful: a 5 ppd (point per dimension) rule is of precision P = 9, but it integrates a polynomial x 9 y 9 exactly. Sparse grids are built to achieve maximal precision with fewest possible points
[Figure: quadrature point layouts]
Gauss-Hermite: sparse (level 2) uses 21 points total, versus 49 for the full 7-ppd rule
Gauss-Legendre: sparse (level 2) uses 17 points total, versus 25 for the full 5-ppd rule
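A minimal sketch of the 1-D precision claim (assuming numpy is available; the node count N and the moment check are illustrative, not from the lecture): an N-point Gauss-Hermite rule reproduces standard-normal moments exactly up to degree 2N - 1.

```python
# Check that an N-point Gauss-Hermite rule has precision 2N - 1 for the
# standard normal weight exp(-x^2/2).
import numpy as np
from math import factorial

N = 5
x, w = np.polynomial.hermite_e.hermegauss(N)   # probabilists' Hermite rule
w = w / np.sqrt(2.0 * np.pi)                   # normalize weights to a density

for k in range(2 * N + 1):
    quad = np.dot(w, x**k)
    # exact standard-normal moment: 0 for odd k, k!/(2^(k/2) (k/2)!) for even k
    exact = 0.0 if k % 2 else factorial(k) / (2**(k // 2) * factorial(k // 2))
    print(k, quad, exact)   # agrees through k = 2N - 1; k = 2N shows an error
```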


Clenshaw-Curtis nested quadrature rules allow reuse of function evaluations


Clenshaw-Curtis, level 1, total 5



Clenshaw-Curtis, level 2, total 13



Clenshaw-Curtis, level 3, total 29



Clenshaw-Curtis, level 4, total 65



Clenshaw-Curtis, level 5, total 145
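A minimal 1-D sketch of the nestedness property (assuming numpy; the Chebyshev-extrema node formula is the standard Clenshaw-Curtis construction, stated here as an assumption): nodes at level l reappear at level l + 1, so earlier function evaluations can be reused.

```python
# Clenshaw-Curtis nodes on [-1, 1] are nested across levels.
import numpy as np

def cc_nodes(level):
    """n = 2^level + 1 Chebyshev extrema on [-1, 1]."""
    n = 2**level + 1
    return np.cos(np.pi * np.arange(n) / (n - 1))

for level in range(1, 5):
    coarse = set(np.round(cc_nodes(level), 12))
    fine = set(np.round(cc_nodes(level + 1), 12))
    print(level, coarse.issubset(fine))   # True: every coarse node is reused
```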


The number of function evaluations is drastically reduced compared to full quadrature


Table: The number of Clenshaw-Curtis sparse grid quadrature points for various levels and dimensionalities.
Level L | Precision p = 2L-1 | N (d=2) | N (d=5) | N (d=10) | N (d), general
1       | 1                  | 1       | 1       | 1        | 1
2       | 3                  | 5       | 11      | 21       | 1 + 2d
3       | 5                  | 13      | 61      | 221      | 1 + 2d + 2d^2
4       | 7                  | 29      | 241     | 1581     | 1 + (14/3)d + 2d^2 + (4/3)d^3
5       | 9                  | 65      | 801     | 8801     | 1 + (20/3)d + (22/3)d^2 + (4/3)d^3 + (2/3)d^4
6       | 11                 | 145     | 2433    | 41265    |
7       | 13                 | 321     | 6993    | 171425   |
8       | 15                 | 705     | 19313   | 652065   |

[Figure: number of Clenshaw-Curtis sparse-grid points and full-grid points versus dimensionality d, for precisions p = 3, 5, 7]
Sparse grid: polynomial growth, O(d^((p-1)/2))
Full grid: exponential growth, (p + 1)^d
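A small sketch of the growth comparison, using only the closed-form counts tabulated above (the helper names are illustrative):

```python
# Compare sparse-grid point counts (table formulas, levels 2-3) with a full
# tensor-product grid of (p + 1)-point 1-D rules.
def sparse_points(d, level):
    if level == 2:
        return 1 + 2 * d
    if level == 3:
        return 1 + 2 * d + 2 * d**2
    raise ValueError("formula not tabulated here for this level")

def full_points(d, precision):
    return (precision + 1)**d

for d in (2, 10, 50, 100):
    # a level-3 sparse grid has precision p = 2*3 - 1 = 5
    print(d, sparse_points(d, 3), full_points(d, 5))
```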


Surface Reaction Model


3 ODEs for a monomer (u), dimer (v), and inert species (w) adsorbing onto a surface out of the gas phase:

du/dt = a z - c u - 4 d u v
dv/dt = 2 b z^2 - 4 d u v
dw/dt = e z - f w
z = 1 - u - v - w,   u(0) = v(0) = w(0) = 0

Nominal parameters: a = 1.6, b = 20.75, c = 0.04, d = 1.0, e = 0.36, f = 0.016
Oscillatory behavior for b in [20.2, 21.2]

[Figure: species mass fraction u versus time]

(Vigil et al., Phys. Rev. E., 1996; Makeev et al., J. Chem. Phys., 2002)
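A minimal sketch of this model as reconstructed above (assuming numpy and scipy; the solver choice and the averaging window are illustrative assumptions):

```python
# Integrate the surface reaction ODEs at the nominal parameters and compute a
# time-averaged u after the initial transient.
import numpy as np
from scipy.integrate import solve_ivp

a, b, c, d, e, f = 1.6, 20.75, 0.04, 1.0, 0.36, 0.016

def rhs(t, y):
    u, v, w = y
    z = 1.0 - u - v - w                 # empty-site fraction
    return [a*z - c*u - 4*d*u*v,        # du/dt
            2*b*z**2 - 4*d*u*v,         # dv/dt
            e*z - f*w]                  # dw/dt

sol = solve_ivp(rhs, (0.0, 1000.0), [0.0, 0.0, 0.0], method="LSODA", max_step=0.1)
u_ss = np.mean(sol.y[0][sol.t > 500.0]) # time-averaged u at steady state
print(u_ss)
```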


Surface Reaction Model: 6d results

[Figure: 2-D slice of the 6-D sparse quadrature point set; PDF of uss with the variance-based sensitivity breakdown over parameters a through f]

Output observable: time-averaged u at steady state, uss
Assume all input parameters have Gaussian distributions with sigma/mu = 0.01, i.e. 1% deviation
The 6-d, level-3 Gauss-Hermite sparse quadrature point set includes 713 distinct points (a 2-d case is plotted)
The output PDF is generated from 100K samples of a third-order PC
Variance-based sensitivity information comes for free with the PC expansion (see the sketch below)
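The last point can be made concrete with a small sketch (the multi-index set and coefficient values below are hypothetical, purely for illustration): main-effect Sobol indices follow directly from the PC coefficients and basis norms.

```python
# Main-effect Sobol indices from a 2-D Hermite PC expansion.
from math import factorial

multi_index = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]  # hypothetical
coeffs      = [ 0.30,   0.10,   0.05,   0.02,   0.01,   0.004]  # hypothetical

def norm_sq(alpha):
    """<Psi_alpha^2> for probabilists' Hermite polynomials: product of alpha_i!"""
    out = 1.0
    for a in alpha:
        out *= factorial(a)
    return out

var_total = sum(c**2 * norm_sq(a) for a, c in zip(multi_index, coeffs) if any(a))
for dim in range(2):
    main = sum(c**2 * norm_sq(a) for a, c in zip(multi_index, coeffs)
               if a[dim] > 0 and all(aj == 0 for j, aj in enumerate(a) if j != dim))
    print(f"S_{dim + 1} = {main / var_total:.3f}")
```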

Advantages and caveats of sparse quadrature approaches

Pro: the number of required samples scales much more gracefully with the number of dimensions than a full tensor-product quadrature rule

Caveats:
The function to be integrated needs to be smooth
Due to negative quadrature weights, integrating a noisy positive function can give a negative answer
For very high dimensions, even sparse quadrature is too expensive


Taking Advantage of Sparsity in the System

For really high dimensional systems, even sparse quadrature requires too many function evaluations
For an 80-dimensional climate land model, level L = 4 requires on the order of 10^6 points
Such systems can only be tackled with dimensionality reduction and/or adaptive order:
Sensitivity analysis
High Dimensional Model Representation (HDMR)
Adaptive sparse quadrature approaches
More generally, use only the basis terms needed to represent the physics / information in the system / data:
(Bayesian) Compressive Sensing (CS) approaches
If the information content is sparse, it can be represented at reasonable cost
If not, you need to pay the price


Challenges with Oscillatory / Long Time Horizon Systems

Issue with oscillations / long time horizons
Importance of the choice of observables


Surface Reaction Model


Same three-ODE surface reaction model, nominal parameters, and oscillatory regime (b in [20.2, 21.2]) as introduced above.

(Vigil et al., Phys. Rev. E., 1996; Makeev et al., J. Chem. Phys., 2002)


Limit Cycle Orbit in Phase Space, v vs u


[Figure: limit cycle orbit in the (u, v) phase plane]


4th-order Intrusive PC UQ
Uncertain b, Wiener-Hermite PC: mean and +/- 3 sigma bounds
[Figure: u(t) mean with uncertainty bounds versus time]


Growth of Phase Errors in Time

[Figure: u-mean from sampling versus u-mean from PC over a long time horizon]

The PC mean deviates from the sampled mean in time
Large phase variances

Scatter in State Space

Much better behaved uncertainty in state space
Phase variances are not of interest per se
Knowledge of the uncertainty in the orbit details is of more interest than the detailed phase variance errors

[Figure: scatter of (u, v) states along the limit cycle]


PC vs. Sampling Limit Cycle Orbit in Phase Space, v vs u

[Figure: limit cycle orbit in the (u, v) phase plane, PC versus sampling]


Surface Reaction Model: Uncertainty in Time Average of u

[Figure: u(t) trajectories for b = 19.4 to 22.6, and the resulting PDF of uss]

Variation of b leads to different qualitative behaviors
Output observable: average over time of u at steady state, uss
Representative, e.g., of the expected coverage in a catalytic system
The parameter b is varied by 10% around its nominal value
The output PDF is generated from 100K samples of a 9th-order PC


Global PC expansions fail to capture multimodality


[Figure: PDF of uss, Monte Carlo with 10,000 samples (true PDF) versus a 10th-order PC built from 12 samples]

In principle, PC-based uncertainty propagation requires far fewer function evaluations.
However, the accuracy of the PC expansion needs to be properly estimated.
E.g., multimodal variables are not well captured, even with a high PC order.

Domain Decomposition Approaches to Handle Nonlinearities

A PCE is essentially a polynomial approximation of a random variable as a function in stochastic space
Global PCEs fail to represent very non-linear functions over large domains
E.g. Gibbs oscillations around discontinuities (see the sketch below)
As the order increases, more oscillations
Piecewise representations can alleviate this
Lower-order approximations over subsections of the stochastic domain
Continuity of the representation across subdomain boundaries is not required
The boundary is an area of zero measure
Allows easy representation of discontinuities
Application: intrusive UQ in a thermal ignition model
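A minimal sketch of the Gibbs issue and the piecewise remedy (assuming numpy; least-squares Legendre fits stand in here for the spectral projection):

```python
# A global polynomial fit to a step response oscillates and overshoots near the
# discontinuity; low-order piecewise fits on each side do not.
import numpy as np

xi = np.linspace(-1.0, 1.0, 2001)
u = np.where(xi < 0.3, 0.0, 1.0)        # discontinuous response u(xi)

global_fit = np.polynomial.legendre.Legendre.fit(xi, u, deg=10)
print("global overshoot:", global_fit(xi).max() - 1.0)       # > 0

left, right = xi < 0.3, xi >= 0.3
pw_left  = np.polynomial.legendre.Legendre.fit(xi[left],  u[left],  deg=2)
pw_right = np.polynomial.legendre.Legendre.fit(xi[right], u[right], deg=2)
print("piecewise overshoot:",
      max(pw_left(xi[left]).max(), pw_right(xi[right]).max()) - 1.0)  # ~0
```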


UQ in constant-pressure ignition

M reactions among N species, with mass fractions Yi:

dYi/dt = wi,   i = 1, ..., N
dT/dt = -wT / cp

with wT = sum_{i=1..N} hi wi and wi = sum_{k=1..M} nu_ik Rk

Example: CH4 + 2 O2 -> CO2 + 2 H2O
Stoichiometric coefficients: nu_ik = {-1, -2, 1, 2}
Reaction rate of progress: Rk = [CH4][O2]^2 Ak T^nk exp(-Ek/T)

Quantify reaction-rate pre-exponential (Ak) uncertainty with a multiplicative factor Fk:
P( Ak/Fk < Ak < Fk Ak ) = 0.95


Large activation energy (Ek ) exponentials lead to very fast changes in species concentrations and temperature

Methane-air ignition
Global single-step irreversible mechanism
Initial T = 800 K, stoichiometric, p = 1 atm (constant)
[Figure: mean temperature versus time (logarithmic time axis) for E = 0 to 50K]


Increased Ek leads to higher peak dT /dt and higher consequence of small uncertainties in reaction rate constants
[Figure: temperature standard deviation versus time for E = 20K and E = 40K, 4th-order WH PC versus 1000 samples]

With Ak = Ak(xi), 1-D Wiener-Hermite PC UQ captures the sampled stochastic behavior at low Ek with minuscule uncertainty in Ak (F = 1.00002, COV = 10^-5).
Unphysical effects are observed at high activation energy.


Errors increase for realistic Ek


[Figure: temperature standard deviation versus time (86.56 to 86.60 msec) for E = 50K, 4th/5th/6th-order WH PC versus 1000 samples]
Ek = 50K, Fk = 1.00002
Max T-stdv is about 300 K
Non-zero uncertainty in T persists at long times
Increased PC order does not resolve the problem


Increasing Ak COV towards minimally-practical levels leads to failed time integration


[Figure: temperature standard deviation near ignition for F = 1.00002, 1.00003, 1.00004 (E = 50K); Tmin, Tav, Tmax from 1000 samples with F = 2.0]

It is unrealistic to expect a 1-D WH PC expansion to capture the expected PDFs at realistic reaction-rate parametric uncertainties
Need increased dimensionality of the PCEs, using multiple xi's for each uncertain parameter, for increased accuracy and stability


Uncertainty Quantification with Multiwavelets

An uncertain field quantity u(x, t, theta) is expressed using PC:

u = sum_{k=0..P} u_k(x, t) Psi_k(xi_1, ..., xi_N)

Introduce eta_i = p(xi_i), the CDF of xi_i, so that eta_i is on [0, 1]:
u = g(xi_1, ..., xi_N) = f(eta_1, ..., eta_N)

Represent f(eta) using N-D multiwavelets (Alpert, 1993):

u = sum_{lambda=0..Q} u_lambda(x, t) W_lambda(eta_1, ..., eta_N)   [Le Maître et al., 2004]


Haar-Wavelets

Haar scaling function:
w(y) = 1 for 0 <= y < 1, and 0 otherwise

Scaled Haar functions, with scaling factor j and sliding factor k:
w_jk(y) = 2^(j/2) w(2^j y - k)

Haar function (mother wavelet):
psi^w(y) = [w_{1,0}(y) - w_{1,1}(y)] / sqrt(2) = 1 for 0 <= y < 1/2, -1 for 1/2 <= y < 1, and 0 otherwise

Wavelet family:
psi^w_{j,k}(y) = 2^(j/2) psi^w(2^j y - k),   j = 0, 1, ... and k = 0, ..., 2^j - 1
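A minimal sketch evaluating these Haar functions and checking orthonormality numerically (assuming numpy; the grid-based integration is an illustrative shortcut):

```python
import numpy as np

def haar_scaling(y):
    """w(y): 1 on [0, 1), 0 otherwise."""
    y = np.asarray(y, dtype=float)
    return np.where((y >= 0.0) & (y < 1.0), 1.0, 0.0)

def haar_mother(y):
    """psi(y): +1 on [0, 1/2), -1 on [1/2, 1), 0 otherwise."""
    y = np.asarray(y, dtype=float)
    return np.where((y >= 0.0) & (y < 0.5), 1.0,
           np.where((y >= 0.5) & (y < 1.0), -1.0, 0.0))

def haar_family(y, j, k):
    """psi_{j,k}(y) = 2^{j/2} psi(2^j y - k)."""
    return 2.0**(j / 2.0) * haar_mother(2.0**j * np.asarray(y) - k)

# mean over a fine grid approximates the integral over [0, 1]
y = np.linspace(0.0, 1.0, 100001)
print(np.mean(haar_family(y, 1, 0) * haar_family(y, 1, 1)))  # ~0 (orthogonal)
print(np.mean(haar_family(y, 1, 0)**2))                      # ~1 (unit norm)
```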


Wiener-Haar Construction

The set of psi^w_{j,k}(y) is an orthonormal system
Any function f in L^2([0, 1]) can be arbitrarily well approximated by the sum of its mean and a finite linear combination of the psi^w_{j,k}(y)
The wavelet set W_{j,k}(xi(theta)) = psi^w_{j,k}(p(xi)) forms a basis for the space of L^2 random processes:

X(xi(theta)) = X_o + sum_{j=0..inf} sum_{k=0..2^j - 1} X^w_{j,k} psi^w_{j,k}(p(xi)) = sum_lambda X_lambda W_lambda(xi(theta))

Multidimensional case, {xi_1, xi_2, ..., xi_N}:
W_lambda = prod_{k=1..N} W_{lambda_k}(xi_k)


Multidimensional Multiwavelet Construction

Wiener-Haar PC is able to represent uncertainty in systems exhibiting bifurcations depending on parameter values
Poor convergence relative to PC constructions with smooth global bases on smooth functions
Use a multiwavelet construction (Alpert, 1993) employing higher-order polynomials instead of the Haar functions
For efficient multidimensional construction, use:
Block decomposition of the stochastic space
A local MW construction on each block employing scaled Legendre polynomials on [0, 1]
First-level multiwavelet details
Adaptive resolution in each dimension on each block


Multi-Wavelet Representation of Model Problem Gaussian IC


IC: U0 = 0.2, U1 = 0.1
No: MW order, Nr: MRA resolution
Local representation using low-order polynomial Multi-Wavelets
Increased No leads to faster convergence to the exact solution with increasing Nr
[Figure: U versus CDF for No = 0, 1, 2, 3 and Nr = 1 to 6, compared with the exact solution]


Adaptive Partition of the Random Parameter Space

For multiple stochastic dimensions, the computational cost of the MW spectral products can become prohibitive
Use adaptive partitioning of the space of random parameters
On each sub-domain define a local scaled random basis
Construct a local MW expansion with up to only 1st-level details
Local Wiener-Legendre projection of order No, plus 1D details
Combine local block statistics to arrive at global statistics
Adaptively refine the block decomposition in each stochastic dimension
Use the 1D detail variances to guide refinement in each dimension
Le Maître et al., J. Comp. Phys., 197:502-531 (2004)


Block decomposition in a 2D parametric space

[Figure: block decomposition of a 2D parametric space (x7, x8)]
Local PC expansion in each subdomain block
Refinement in the directions where it is needed


Computational Advantages with MRA UQ

Adaptive partitioning allows optimal combinations of resolution and order over space-time
The UQ problem can be done separately on each sub-domain
This can be done intrusively or non-intrusively on each sub-domain
Presents advantages in a massively parallel context


Generate "ignition data" using a detailed model + noise

Ignition using a detailed chemical model for methane-air chemistry (GRI)
Ignition time versus initial temperature
Multiplicative noise error model, 11 data points:

d_i = t_ig,i^GRI (1 + sigma eps_i),   eps_i ~ N(0, 1)

[Figure: ignition time (sec) versus initial temperature (K), GRI and GRI+noise]


Fitting with a simple chemical model


Fit a global single-step irreversible chemical model:
CH4 + 2 O2 -> CO2 + 2 H2O
R = [CH4][O2] k_f,   k_f = A exp(-E / R° T)
Infer the 3-D parameter vector (ln A, ln E, ln sigma)
Good mixing with adaptive MCMC when started at the MLE
[Figure: MCMC chains for ln A, ln E, and ln sigma versus chain step]


Bayesian Inference Posterior and Nominal Prediction

[Figure: marginal joint posterior samples of (ln A, ln E); ignition time versus initial temperature for GRI, GRI+noise, and the fit model]

Marginal joint posterior on (ln A, ln E ) exhibits strong correlation

The nominal fit model is consistent with the true model


Correlation Slope and Chemical Ignition

[Figure: means and standard deviations of temperature and species mass fractions (CH4, O2, CO2, H2O) versus time through ignition]

4th-order Multiwavelet PC, multiblock, adaptive: the maximum temperature standard deviation is about 400 K during the ignition transient, with species mass-fraction standard deviations up to about 0.03


Time evolution of Temperature PDFs in preheat stage

[Figure: temperature PDFs at t = 0.455, 0.459, 0.462, and 0.464 sec, MC versus MW]

Similar results from MC (20K samples) and MW PC
Increased uncertainty, and long high-T PDF tails, in time


Evolution of Temp. PDF Fast Ignition Transient

[Figure: temperature PDFs at t = 0.4642, 0.4660, 0.4664, 0.4668, and 0.4671 sec, MC versus MW]

Transition from unimodal to bimodal PDFs
Leakage of probability mass from the pre-heat PDF high-T tail


Time evolution of Temperature PDFs for different sigma


0.022875
0.01 0.008 Probability Density Probability Density

0.006 0.005

0.036675
Probability Density

0.03 0.025

0.042425
Probability Density

0.06 0.05 0.04 0.03 0.02 0.01 0

0.04325

0.004 0.003 0.002 0.001 0

0.02 0.015 0.01 0.005 0

0.006 0.004 0.002 0

1500

2000 2500 Temperature (K)

3000

1500

2000 2500 Temperature (K)

3000

1500

2000 2500 Temperature (K)

3000

1500

2000 2500 Temperature (K)

3000

0.04395
0.25 Probability Density
Probability Density

0.08 0.07 0.06

0.04475
0.01 0.008 0.006 0.004 0.002 0

0.052475
Probability Density Probability Density

0.006 0.005 0.004 0.003 0.002 0.001 0

0.066275

0.2 0.15 0.1 0.05 0

0.05 0.04 0.03 0.02 0.01

1500

2000 2500 Temperature (K)

3000

1500

2000 2500 Temperature (K)

3000

1500

2000 2500 Temperature (K)

3000

1500

2000 2500 Temperature (K)

3000

Bimodal solution PDFs for high uncertainty growth
Unimodal for low uncertainty growth, with sigma below about 0.044


Data Decomposition Approaches to Handle Multi-Modalities

Alternative approach when faced with samples of multi-modal random variables:
Separate the data into multiple sets that are easier to represent
Represent each with a global PCE
The overall result is a PC mixture model
Generalization of Gaussian mixture models
Application: Karhunen-Loève decomposition applied to the output state of the Schlögl model


The Schlögl Model is a prototype bistable model


Reactions (with propensities a1 through a4):
A + 2X <-> 3X   (a1 forward, a2 reverse)
B <-> X   (a3 forward, a4 reverse)

Propensities: a1 = k1 A X(X-1)/2, a2 = k2 X(X-1)(X-2)/6, a3 = k3 B, a4 = k4 X

Nominal parameters: k1 A = 0.03, k2 = 0.0001, k3 B = 200, k4 = 3.5, A = 10^5, B = 2x10^5, X(0) = 250

[Figure: sample trajectories X(t) versus time, illustrating the two stable branches]
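A minimal sketch of stochastic simulation (Gillespie SSA) realizations of this model, using the lumped constants k1·A and k3·B from the table above (assuming numpy; the seed and time horizon are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
k1A, k2, k3B, k4 = 0.03, 1.0e-4, 200.0, 3.5   # lumped rate constants from the slide

def propensities(X):
    return np.array([k1A * X * (X - 1) / 2.0,           # A + 2X -> 3X
                     k2 * X * (X - 1) * (X - 2) / 6.0,  # 3X -> A + 2X
                     k3B,                               # B -> X
                     k4 * X])                           # X -> B

change = np.array([+1, -1, +1, -1])                     # net change in X per reaction

def ssa(X0=250, t_end=20.0):
    t, X = 0.0, X0
    while t < t_end:
        a = propensities(X)
        a0 = a.sum()
        if a0 <= 0.0:
            break
        t += rng.exponential(1.0 / a0)                  # time to next reaction
        X += change[rng.choice(4, p=a / a0)]            # which reaction fires
    return X

print([ssa() for _ in range(3)])   # end states of three realizations
```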


Polynomial Chaos expansion represents any random variable as a polynomial of a standard random variable
Truncated PCE with finite dimension n and order p:

X(theta) ~ sum_{k=0..P} c_k Psi_k(xi)

with the number of terms P + 1 = (n + p)! / (n! p!)

xi = (xi_1, ..., xi_n): standard i.i.d. r.v.
Psi_k: standard orthogonal polynomials
c_k: spectral modes
Most common standard Polynomial-Variable pairs: (continuous) Gauss-Hermite, Legendre-Uniform; (discrete) Poisson-Charlier.
[Wiener, 1938; Ghanem & Spanos, 1991; Xiu & Karniadakis, 2002]
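The term count is easy to tabulate; a minimal sketch in plain Python:

```python
# Number of terms P + 1 = (n + p)! / (n! p!) in a truncated PCE.
from math import comb

for n in (1, 3, 6, 10):          # stochastic dimension
    for p in (2, 3, 5):          # polynomial order
        print(n, p, comb(n + p, p))
```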


Galerkin Projection is typically needed


PC expansion: X(theta) ~ sum_{k=0..P} c_k Psi_k(xi) = g_D(xi)

Orthogonal projection: c_k = <X(theta) Psi_k(xi)> / <Psi_k^2(xi)>

Intrusive Spectral Projection (ISP)
Direct projection of the governing equations
Leads to deterministic equations for the PC coefficients
No explicit governing equation for SRNs

Non-intrusive Spectral Projection (NISP)
Sampling based
No explicit evolution equation for X needed
Galerkin projection not well-defined for SRNs
(see the 1-D NISP sketch below)
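A minimal 1-D NISP sketch (assuming numpy; the test function exp(xi) is chosen because its Hermite coefficients are known in closed form, c_k = e^{1/2}/k!):

```python
# Non-intrusive projection of X = g(xi) = exp(xi), xi ~ N(0,1), onto
# probabilists' Hermite polynomials via Gauss-Hermite quadrature.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, exp

P = 6
x, w = He.hermegauss(20)                 # nodes/weights for weight exp(-x^2/2)
w = w / np.sqrt(2.0 * np.pi)             # normalize to the standard normal density

def psi(k, x):
    c = np.zeros(k + 1); c[k] = 1.0
    return He.hermeval(x, c)             # He_k(x)

g = np.exp(x)                            # model evaluated only at the nodes
for k in range(P + 1):
    ck = np.dot(w, g * psi(k, x)) / factorial(k)   # <g psi_k> / <psi_k^2>, norm = k!
    print(k, ck, exp(0.5) / factorial(k))          # exact coefficient for comparison
```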


Karhunen-Loève decomposition reduces a stochastic process to a finite number of random variables


KL decomposition:

X(t, theta) = Xbar(t) + sum_{n=1..inf} lambda_n^(1/2) eta_n(theta) f_n(t)

Uncorrelated, zero-mean KL variables: <eta_n> = 0, <eta_n eta_m> = delta_nm
SSA (continuum) X(t) -> KL (discrete) eta = (eta_1, eta_2, ...)

[Figure: KL mode shapes f_i(t) and the eigenvalue spectra lambda_n]
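A minimal sketch of a discrete KL decomposition built from an ensemble of trajectories via the sample covariance matrix (assuming numpy; the synthetic ensemble is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 20.0, 200)
X = np.array([np.sin(t + rng.normal()) + 0.1 * rng.normal(size=t.size)
              for _ in range(500)])                 # synthetic ensemble of trajectories

Xbar = X.mean(axis=0)
C = np.cov(X - Xbar, rowvar=False)                  # covariance over time points
lam, f = np.linalg.eigh(C)
lam, f = lam[::-1], f[:, ::-1]                      # sort eigenpairs, largest first

L = 5                                               # retained KL modes
eta = (X - Xbar) @ f[:, :L] / np.sqrt(lam[:L])      # KL variables per realization
X_kl = Xbar + (eta * np.sqrt(lam[:L])) @ f[:, :L].T # truncated reconstruction
print(np.abs(X - X_kl).max())                       # small if L captures the variance
```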



K-L decomposition captures each realization

[Figure: a realization X(t) and its KL reconstructions with L = 10, 30, and 60 terms]


PC expansion of a random vector


eta = sum_{k=0..P} c_k Psi_k(xi)

The Galerkin projection

c_k = <eta Psi_k(xi)> / <Psi_k^2(xi)>

is not well-defined, since eta and xi do not belong to the same stochastic space.


Need a map xi -> eta.


Rosenblatt transformation
The Rosenblatt transformation maps any (not necessarily independent) set of random variables (eta_1, ..., eta_n) to uniform i.i.d. variables {u_i}, i = 1..n (Rosenblatt, 1952):

u_1 = F_1(eta_1)
u_2 = F_{2|1}(eta_2 | eta_1)
u_3 = F_{3|2,1}(eta_3 | eta_2, eta_1)
...
u_n = F_{n|n-1,...,1}(eta_n | eta_{n-1}, ..., eta_1)

The inverse Rosenblatt transformation eta = R^{-1}(xi) ensures a well-defined quadrature integration:

<eta_i Psi_k(xi)> = integral of R^{-1}(xi)_i Psi_k(xi) pi(xi) dxi
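A minimal sketch of the forward Rosenblatt map for a correlated bivariate Gaussian, where the conditional CDFs are known in closed form (assuming numpy/scipy; the correlation value is illustrative):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
rho = 0.8
eta = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=20000)

u1 = norm.cdf(eta[:, 0])                                   # F_1(eta_1)
# eta_2 | eta_1 ~ N(rho*eta_1, 1 - rho^2)
u2 = norm.cdf(eta[:, 1], loc=rho * eta[:, 0], scale=np.sqrt(1.0 - rho**2))

print(np.corrcoef(u1, u2)[0, 1])   # ~0: the mapped variables are independent
print(u1.mean(), u2.mean())        # ~0.5 each: uniform on [0, 1]
```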


KL+PC+Data Partitioning represents the dynamics of a bimodal process

KL-PC representation, 5 KL modes, 3rd PC order


[Figure: reconstructed realizations X_KLPC(t) versus time]


UQ in Systems with Inherent Stochastic Noise

Some systems have intrinsic uncertainty:
Stochastic reaction networks
Macroscale quantities extracted from atomistic methods with sampling
Galerkin projection is challenged by such systems: there is no deterministic governing equation for intrusive UQ
Quadrature methods, especially sparse methods, are challenged by noise in function evaluations
Use Bayesian regression to infer the PC coefficients


Summary

Polynomial Chaos Expansions offer a convenient way to represent random variables
Both intrusive and non-intrusive approaches are available to propagate uncertain model inputs to the model outputs
While conceptually straightforward, many challenges remain in terms of efficiency and accuracy


Further Reading

N. Wiener, "Homogeneous Chaos", American Journal of Mathematics, 60:4, pp. 897-936, 1938.
M. Rosenblatt, "Remarks on a Multivariate Transformation", Ann. Math. Statist., 23:3, pp. 470-472, 1952.
R. Ghanem and P. Spanos, "Stochastic Finite Elements: a Spectral Approach", Springer, 1991.
O. Le Maître and O. Knio, "Spectral Methods for Uncertainty Quantification with Applications to Computational Fluid Dynamics", Springer, 2010.
D. Xiu, "Numerical Methods for Stochastic Computations: A Spectral Method Approach", Princeton U. Press, 2010.
O.G. Ernst, A. Mugler, H.-J. Starkloff, and E. Ullmann, "On the convergence of generalized polynomial chaos expansions", ESAIM: M2AN, 46:2, pp. 317-339, 2011.
D. Xiu and G.E. Karniadakis, "The Wiener-Askey Polynomial Chaos for Stochastic Differential Equations", SIAM J. Sci. Comput., 24:2, 2002.
Le Maître, Ghanem, Knio, and Najm, J. Comp. Phys., 197:28-57 (2004).
Le Maître, Najm, Ghanem, and Knio, J. Comp. Phys., 197:502-531 (2004).
B. Debusschere, H. Najm, P. Pébay, O. Knio, R. Ghanem and O. Le Maître, "Numerical Challenges in the Use of Polynomial Chaos Representations for Stochastic Processes", SIAM J. Sci. Comp., 26:2, 2004.
S. Ji, Y. Xue and L. Carin, "Bayesian Compressive Sensing", IEEE Trans. Signal Proc., 56:6, 2008.
K. Sargsyan, B. Debusschere, H. Najm and O. Le Maître, "Spectral representation and reduced order modeling of the dynamics of stochastic reaction networks via adaptive data partitioning", SIAM J. Sci. Comp., 31:6, 2010.
K. Sargsyan, B. Debusschere, H. Najm and Y. Marzouk, "Bayesian inference of spectral expansions for predictability assessment in stochastic reaction networks", J. Comp. Theor. Nanosc., 6:10, 2009.
