AMCS 308 - Stochastic Numerics with Application in Simulation and Data Science by Prof. Raul Tempone
Review of basic probability; Monte Carlo simulation; state-space models and time series; parameter estimation, prediction, and filtering; Markov chains and processes; stochastic control; Markov chain Monte Carlo. Examples from various engineering disciplines.
Overview
Spring Semesters
COURSE OBJECTIVES: Students who follow this course will become acquainted with stochastic computational tools and their mathematical foundations, which are used to analyze systems with uncertainty arising in engineering, physics, chemistry, and economics. By the end of the course, students should be able to:
- Analyze the convergence and computational work of sampling algorithms
- Implement sampling methods for different stochastic processes
- Propose efficient sampling methods for different stochastic problems
COURSE OUTLINE:
Probability Basics
- Probability refresher
- Conditional Expectation
- Bayesian Statistics
The Monte Carlo Method
- Sampling of random variables
- The Central Limit Theorem and related results
- Convergence rates and confidence intervals (see the Monte Carlo sketch after this outline)
- Variance reduction techniques
- Large deviations and rare-event simulation
- Kernel density estimators
- Chernoff inequality and large deviations
- Resampling techniques
Linear and Gaussian Models
Discrete-Time Markov Chains
Bayesian Filters, Kalman Filters, and State Estimation (see the Kalman filter sketch after this outline)
Continuous-Time Markov Chains
- Pure jump processes
- The Stochastic Simulation Algorithm (SSA) and tau-leaping (see the SSA sketch after this outline)
Markov Chain Monte Carlo (see the Metropolis-Hastings sketch after this outline)
Other topics that may be addressed if time allows: continuous-time and continuous-space Markov chains, stochastic optimization, stochastic optimal control, optimal experimental design, model selection and model validation, and machine learning.
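As a concrete illustration of the Monte Carlo convergence and confidence-interval items above, here is a minimal sketch of a plain Monte Carlo estimator with a CLT-based 95% confidence interval; the integrand, the sampling distribution, and the sample size are illustrative choices, not part of the course material.

```python
# Minimal sketch: plain Monte Carlo estimation of E[g(X)] with an asymptotic
# 95% confidence interval from the Central Limit Theorem.  The integrand and
# the sampling distribution below are illustrative choices only.
import numpy as np

def mc_estimate(g, sampler, n):
    """Sample mean of g over n i.i.d. draws plus a CLT-based 95% interval."""
    samples = g(sampler(n))
    mean = samples.mean()
    # The standard error decays like n**(-1/2), the canonical Monte Carlo rate.
    stderr = samples.std(ddof=1) / np.sqrt(n)
    half_width = 1.96 * stderr  # normal quantile for 95% coverage
    return mean, (mean - half_width, mean + half_width)

# Example: estimate E[X^2] for X ~ N(0, 1); the exact value is 1.
rng = np.random.default_rng(0)
mean, ci = mc_estimate(lambda x: x**2, rng.standard_normal, n=10_000)
print(f"estimate = {mean:.4f}, 95% CI = ({ci[0]:.4f}, {ci[1]:.4f})")
```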
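The next sketch illustrates the Kalman filter for a scalar linear-Gaussian state-space model, tying together the linear and Gaussian models and state-estimation items; the dynamics, noise variances, and initial condition are assumptions made only for this example.

```python
# Minimal sketch: Kalman filter for a scalar linear-Gaussian state-space model
#   x_k = a * x_{k-1} + w_k,   w_k ~ N(0, q)
#   y_k = x_k + v_k,           v_k ~ N(0, r)
# All model parameters below are illustrative assumptions for this example.
import numpy as np

def kalman_filter_1d(ys, a=0.9, q=0.1, r=0.5, m0=0.0, p0=1.0):
    """Return the filtered means and variances of x_k given y_1, ..., y_k."""
    m, p = m0, p0
    means, variances = [], []
    for y in ys:
        # Predict: propagate the posterior mean and variance through the dynamics.
        m_pred, p_pred = a * m, a * a * p + q
        # Update: condition the prediction on the new observation y.
        gain = p_pred / (p_pred + r)  # Kalman gain
        m = m_pred + gain * (y - m_pred)
        p = (1.0 - gain) * p_pred
        means.append(m)
        variances.append(p)
    return np.array(means), np.array(variances)

# Example: filter noisy observations of a simulated latent state.
rng = np.random.default_rng(1)
x, ys = 2.0, []
for _ in range(50):
    x = 0.9 * x + np.sqrt(0.1) * rng.standard_normal()
    ys.append(x + np.sqrt(0.5) * rng.standard_normal())
means, variances = kalman_filter_1d(ys)
print(means[-3:], variances[-3:])
```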
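For the continuous-time Markov chain items, the following sketch simulates one exact path of a birth-death pure jump process with Gillespie's Stochastic Simulation Algorithm; the reaction rates and initial state are illustrative choices.

```python
# Minimal sketch: Gillespie's Stochastic Simulation Algorithm (SSA) for a
# birth-death pure jump process; the rates and initial state are illustrative.
import numpy as np

def ssa_birth_death(x0=10, birth=1.0, death=0.1, t_final=50.0, seed=2):
    """Simulate one exact trajectory of the birth-death process up to t_final."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_final:
        # Propensities of the two reactions: birth (x -> x + 1), death (x -> x - 1).
        a_birth, a_death = birth, death * x
        a_total = a_birth + a_death
        if a_total == 0.0:
            break  # absorbing state: no further jumps
        # Exponentially distributed waiting time until the next jump.
        t += rng.exponential(1.0 / a_total)
        # Pick the reaction that fires, with probability proportional to its propensity.
        x += 1 if rng.uniform() < a_birth / a_total else -1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

times, states = ssa_birth_death()
print(f"jumps: {len(times) - 1}, final state: {states[-1]}")
```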
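Finally, a minimal random-walk Metropolis-Hastings sketch for the Markov Chain Monte Carlo item, targeting a density known only up to a constant through its log; the target, proposal step size, and chain length are illustrative choices only.

```python
# Minimal sketch: random-walk Metropolis-Hastings targeting a density known up
# to a normalizing constant via its log; all tuning choices are illustrative.
import numpy as np

def metropolis_hastings(log_target, x0=0.0, n_steps=10_000, step=1.0, seed=3):
    """Return a Markov chain whose stationary distribution is the target."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    x, logp = x0, log_target(x0)
    for i in range(n_steps):
        # Symmetric Gaussian proposal centred at the current state.
        x_prop = x + step * rng.standard_normal()
        logp_prop = log_target(x_prop)
        # Accept with probability min(1, target(x') / target(x)).
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = x_prop, logp_prop
        chain[i] = x
    return chain

# Example: target N(0, 1); the chain's mean and variance should be near 0 and 1.
chain = metropolis_hastings(lambda x: -0.5 * x**2)
print(chain.mean(), chain.var())
```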
---
References
S. Asmussen and P. W. Glynn. Stochastic Simulation: Algorithms and Analysis. Springer, 2007. A broad treatment of sampling-based computational methods with mathematical analysis of the convergence properties of the methods discussed.
D. Gamerman and H. F. Lopes. Markov Chain Monte Carlo. Chapman & Hall/CRC, second edition, 2006.
P. W. Glynn. Stochastic Methods in Engineering. Lecture notes, Stanford University.
J. Kaipio and E. Somersalo. Statistical and Computational Inverse Problems. Springer, 2005.
J. R. Norris. Markov Chains. Cambridge University Press, 1997. Discrete-time and continuous-time Markov chains with applications to simulation in chemistry and physics, economics, optimal control, genetics, queues, and many other topics.
D. S. Sivia and J. Skilling. Data Analysis: A Bayesian Tutorial. Oxford University Press, second edition, 2006. Basic principles of Bayesian probability theory, illustrated with examples ranging from elementary parameter estimation to image processing. Other topics covered include reliability analysis, multivariate optimization, least-squares and maximum likelihood, error propagation, hypothesis testing, maximum entropy, and experimental design.
J. Voss. An Introduction to Statistical Computing. Wiley, 2014.