The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering.
The book introduces polynomial approximation theory and probability theory; describes the basic theory of gPC methods through numerical examples and rigorous development; details the procedure for converting stochastic equations into deterministic ones using both the Galerkin and collocation approaches; and discusses the distinct differences and challenges arising from high-dimensional problems. The last section is devoted to the application of gPC methods to critical areas such as inverse problems and data assimilation.
Ideal for use by graduate students and researchers both in the classroom and for self-study, Numerical Methods for Stochastic Computations provides the required tools for in-depth research related to stochastic computations.
"[A]s a newbie to this field, by reading this lively written text I was able to gain insight into this really interesting and challenging matter."--Peter Mathé, Mathematical Reviews Chapter 2: Basic Concepts of Probability Theory 9 Chapter 3: Survey of Orthogonal Polynomials and Approximation Theory 25 Chapter 4: Formulation of Stochastic Systems 44 Chapter 5: Generalized Polynomial Chaos 57 Chapter 6: Stochastic Galerkin Method 68 Chapter 7: Stochastic Collocation Method 78 Chapter 8: Miscellaneous Topics and Applications 89 Appendix A: Some Important Orthogonal Polynomials in the Askey Scheme 105 Appendix B: The Truncated Gaussian Model G(a?, ?ß) 113
Chapter 1: Introduction 1
1.1 Stochastic Modeling and Uncertainty Quantification 1
1.1.1 Burgers' Equation: An Illustrative Example 1
1.1.2 Overview of Techniques 3
1.1.3 Burgers' Equation Revisited 4
1.2 Scope and Audience 5
1.3 A Short Review of the Literature 6
Chapter 2: Basic Concepts of Probability Theory 9
2.1 Random Variables 9
2.2 Probability and Distribution 10
2.2.1 Discrete Distribution 11
2.2.2 Continuous Distribution 12
2.2.3 Expectations and Moments 13
2.2.4 Moment-Generating Function 14
2.2.5 Random Number Generation 15
2.3 Random Vectors 16
2.4 Dependence and Conditional Expectation 18
2.5 Stochastic Processes 20
2.6 Modes of Convergence 22
2.7 Central Limit Theorem 23
Chapter 3: Survey of Orthogonal Polynomials and Approximation Theory 25
3.1 Orthogonal Polynomials 25
3.1.1 Orthogonality Relations 25
3.1.2 Three-Term Recurrence Relation 26
3.1.3 Hypergeometric Series and the Askey Scheme 27
3.1.4 Examples of Orthogonal Polynomials 28
3.2 Fundamental Results of Polynomial Approximation 30
3.3 Polynomial Projection 31
3.3.1 Orthogonal Projection 31
3.3.2 Spectral Convergence 33
3.3.3 Gibbs Phenomenon 35
3.4 Polynomial Interpolation 36
3.4.1 Existence 37
3.4.2 Interpolation Error 38
3.5 Zeros of Orthogonal Polynomials and Quadrature 39
3.6 Discrete Projection 41
Chapter 4: Formulation of Stochastic Systems 44
4.1 Input Parameterization: Random Parameters 44
4.1.1 Gaussian Parameters 45
4.1.2 Non-Gaussian Parameters 46
4.2 Input Parameterization: Random Processes and Dimension Reduction 47
4.2.1 Karhunen-Loève Expansion 47
4.2.2 Gaussian Processes 50
4.2.3 Non-Gaussian Processes 50
4.3 Formulation of Stochastic Systems 51
4.4 Traditional Numerical Methods 52
4.4.1 Monte Carlo Sampling 53
4.4.2 Moment Equation Approach 54
4.4.3 Perturbation Method 55
Chapter 5: Generalized Polynomial Chaos 57
5.1 Definition in Single Random Variables 57
5.1.1 Strong Approximation 58
5.1.2 Weak Approximation 60
5.2 Definition in Multiple Random Variables 64
5.3 Statistics 67
Chapter 6: Stochastic Galerkin Method 68
6.1 General Procedure 68
6.2 Ordinary Differential Equations 69
6.3 Hyperbolic Equations 71
6.4 Diffusion Equations 74
6.5 Nonlinear Problems 76
Chapter 7: Stochastic Collocation Method 78
7.1 Definition and General Procedure 78
7.2 Interpolation Approach 79
7.2.1 Tensor Product Collocation 81
7.2.2 Sparse Grid Collocation 82
7.3 Discrete Projection: Pseudospectral Approach 83
7.3.1 Structured Nodes: Tensor and Sparse Tensor Constructions 85
7.3.2 Nonstructured Nodes: Cubature 86
7.4 Discussion: Galerkin versus Collocation 87
Chapter 8: Miscellaneous Topics and Applications 89
8.1 Random Domain Problem 89
8.2 Bayesian Inverse Approach for Parameter Estimation 95
8.3 Data Assimilation by the Ensemble Kalman Filter 99
8.3.1 The Kalman Filter and the Ensemble Kalman Filter 100
8.3.2 Error Bound of the EnKF 101
8.3.3 Improved EnKF via gPC Methods 102
Appendix A: Some Important Orthogonal Polynomials in the Askey Scheme 105
A.1 Continuous Polynomials 106
A.2 Discrete Polynomials 108
Appendix B: The Truncated Gaussian Model G(α, β) 113
References 117
Index 127