Short Course | Instructor | Schedule
Modern Response Surface Methods & Computer Experiments | Dr. Robert B. Gramacy, Virginia Tech | Tuesday, March 20th, 0900–1730
This course details statistical techniques at the interface between mathematical modeling via computer simulation, computer model meta-modeling (i.e., emulation/surrogate modeling), calibration of computer models to data from field experiments, and model-based sequential design and optimization under uncertainty (a.k.a. Bayesian optimization). The treatment will include some of the historical methodology in the literature, and canonical examples, but will primarily concentrate on modern statistical methods, computation, and implementation, as well as modern applications, data types, and problem sizes. The course will return at several junctures to real-world experiments from the physical and engineering sciences, such as studying the aeronautical dynamics of a rocket booster re-entering the atmosphere; modeling the drag on satellites in orbit; designing a hydrological remediation scheme for water sources threatened by underground contaminants; and studying the formation of supernovae via radiative shock hydrodynamics. The course material will emphasize deriving and implementing methods over proving theoretical properties.
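The surrogate-modeling and sequential-design loop the abstract describes can be illustrated in a few lines. The following is a minimal Python sketch (not course material): a Gaussian-process surrogate is fit to a handful of runs of a toy one-dimensional "simulator," and the next run is chosen by maximizing expected improvement. The kernel, lengthscale, toy simulator, and initial design are all invented for illustration.

```python
# Illustrative sketch of one Bayesian-optimization step: fit a Gaussian-process
# surrogate to a few expensive evaluations, then pick the next input by
# maximizing expected improvement (EI) over a candidate grid.
import numpy as np
from math import erf, sqrt, pi

def rbf_kernel(X1, X2, lengthscale=0.2):
    # Squared-exponential covariance between two sets of 1-d inputs.
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xstar, noise=1e-6):
    # Standard GP conditional (posterior) mean and variance at Xstar.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xstar)
    Kss_diag = np.ones(len(Xstar))            # k(x, x) = 1 for this kernel
    mu = Ks.T @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks)
    var = Kss_diag - np.sum(Ks * v, axis=0)
    return mu, np.maximum(var, 1e-12)         # clamp tiny negative round-off

def expected_improvement(mu, var, best):
    # EI for minimization: E[max(best - f, 0)] under f ~ N(mu, var).
    s = np.sqrt(var)
    z = (best - mu) / s
    Phi = 0.5 * (1 + np.vectorize(erf)(z / sqrt(2)))   # normal CDF
    phi = np.exp(-0.5 * z**2) / sqrt(2 * pi)           # normal PDF
    return (best - mu) * Phi + s * phi

# Toy "expensive" simulator (stand-in for a computer model).
f = lambda x: np.sin(6 * x) + x
X = np.array([0.1, 0.5, 0.9])        # small initial design
y = f(X)
grid = np.linspace(0.0, 1.0, 201)    # candidate inputs
mu, var = gp_posterior(X, y, grid)
ei = expected_improvement(mu, var, y.min())
x_next = grid[np.argmax(ei)]         # next run of the simulator
```

In practice the lengthscale and noise would be estimated rather than fixed, and the candidate grid would be replaced by a space-filling design or a numerical optimizer in higher dimensions.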
---
Overview of Design of Experiments | Dr. Doug Montgomery, Arizona State University & Dr. Brad Jones, JMP | Tuesday, March 20th, 0900–1730
Well-designed experiments are a powerful tool for developing and validating cause-and-effect relationships when evaluating and improving product and process performance, and for operational testing of complex systems. Designed experiments are the only efficient way to verify the impact of changes in product or process factors on actual performance.

The course outcomes are:

- Ability to plan and execute experiments
- Ability to collect, analyze, and interpret data to provide the knowledge required for business success
- Knowledge of a wide range of modern experimental tools that enable practitioners to customize their experiment to meet practical resource constraints

The topics covered during the course are:

- Fundamentals of DOX: randomization, replication, and blocking
- Planning a designed experiment: type and size of design, factor selection, levels and ranges, response measurement, sample sizes
- Graphical and statistical approaches to DOX analysis
- Blocking to eliminate the impact of nuisance factors on experimental results
- Factorial experiments and interactions
- Fractional factorials: efficient and effective use of experimental resources
- Optimal designs
- Response surface methods
- A demonstration illustrating and comparing the effectiveness of different experimental design strategies

This course is focused on helping you and your organization make the most effective use of DOX. Software usage is fully integrated into the course.
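The factorial ideas listed above (coded factor levels, main effects, interactions, orthogonality) can be sketched numerically. The following Python example is illustrative only: it builds a 2^3 full factorial in coded (-1/+1) units and recovers main-effect and interaction coefficients by least squares; the factor names A, B, C and the simulated response are invented.

```python
# Illustrative sketch: a 2^3 full factorial design and effect estimation.
import numpy as np
from itertools import product

# 8-run full factorial design matrix in coded (-1/+1) units.
runs = np.array(list(product([-1, 1], repeat=3)), dtype=float)
A, B, C = runs.T

# Simulated (noise-free) response: strong A effect, AB interaction, small C.
y = 10 + 4 * A + 1.5 * A * B + 0.5 * C

# Model matrix: intercept, main effects, two-factor interactions.
X = np.column_stack([np.ones(8), A, B, C, A * B, A * C, B * C])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
coefs = dict(zip(["I", "A", "B", "C", "AB", "AC", "BC"], coef))

# Because the 2-level factorial columns are mutually orthogonal, each
# coefficient is estimated independently; the classical DOX "effect"
# (high-level average minus low-level average) is twice the coefficient.
```

Orthogonality is what makes the full (and regular fractional) factorial so efficient: X.T @ X is diagonal, so every effect estimate uses all eight runs and none contaminates another.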
---
Introduction to R | Dr. Justin Post, NCSU | Tuesday, March 20th, 0900–1730
This course is designed to introduce participants to the R programming language and the RStudio editor. R is free and open-source software for summarizing data, creating visuals of data, and conducting statistical analyses. R offers many advantages over programs such as Excel, including faster computation, customized analyses, access to the latest statistical techniques, automation of tasks, and the ability to easily reproduce research. After completing this course, a new user should be able to:
---
Survey Construction and Analysis | Dr. Wendy Martinez & Dr. MoonJung Cho, Bureau of Labor Statistics | Tuesday, March 20th, 0900–1730
In this course, we introduce the main concepts of the survey methodology process, from survey sampling design to analyzing the data obtained from complex survey designs. The course topics include:

We use a combination of lectures and hands-on exercises in R. Students are expected to have R and the associated packages installed on their computers; we will send a list of required packages before the course. We also use data from Department of Defense surveys, where appropriate.
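The core of design-based analysis for complex samples is weighting each observation by the inverse of its selection probability. As a minimal illustration (the course itself uses R and its survey packages), the following Python sketch computes a weighted mean from a two-stratum sample; the stratum names, population sizes, and data are invented.

```python
# Illustrative sketch: a design-weighted (Horvitz-Thompson style) estimate of
# a population mean from a stratified simple random sample.
import numpy as np

# Two strata: population sizes and the values observed in each sample.
N = {"urban": 8000, "rural": 2000}                      # stratum population sizes
sample = {"urban": np.array([52.0, 48.0, 50.0, 55.0]),  # n_h = 4
          "rural": np.array([30.0, 34.0])}              # n_h = 2

total = 0.0
for h, y in sample.items():
    w = N[h] / len(y)            # design weight: inverse inclusion probability
    total += w * y.sum()         # estimated stratum total

ybar = total / sum(N.values())   # estimated population mean
```

Note that the naive unweighted mean of the six observations would over-represent the rural stratum; the design weights undo the unequal sampling rates.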
---
Uncertainty Quantification | Dr. Ralph Smith, NCSU | Tuesday, March 20th, 0900–1730
We increasingly rely on mathematical and statistical models to predict phenomena ranging from nuclear power plant design to profits made in financial markets. When assessing the feasibility of these predictions, it is critical to quantify the uncertainties associated with the models, the inputs to the models, and the data used to calibrate the models. The synthesis of statistical and mathematical techniques that can quantify input and response uncertainties for simulation codes taking hours to days to run comprises the evolving field of uncertainty quantification.

The use of data to improve the predictive accuracy of models is central to uncertainty quantification, so we will begin with an overview of how Bayesian techniques can be used to construct distributions for model inputs. We will subsequently describe the computational issues associated with propagating these distributions through complex models to construct prediction intervals for statistical quantities of interest, such as expected profits or maximal reactor temperatures. Finally, we will describe the use of sensitivity analysis to isolate critical model inputs, and surrogate model construction for simulation codes that are too complex for direct statistical analysis. All topics will be motivated by examples arising in engineering, biology, and economics.
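The forward-propagation step described above can be illustrated with plain Monte Carlo. The Python sketch below is hypothetical: the "model" is a cheap stand-in (in practice it would be an expensive simulation or its surrogate), and the input distributions are assumed for illustration.

```python
# Illustrative sketch of forward uncertainty propagation: draw inputs from
# their (assumed) distributions, push each draw through the model, and report
# an empirical 95% prediction interval for the output.
import numpy as np

rng = np.random.default_rng(0)

def model(k, q):
    # Toy steady-state temperature rise: heat load q over conductance k.
    return q / k

# Input uncertainty: conductance and heat load with assumed distributions.
k = rng.normal(loc=2.0, scale=0.1, size=10_000)
q = rng.uniform(low=90.0, high=110.0, size=10_000)

out = model(k, q)                           # propagate each joint draw
lo, hi = np.percentile(out, [2.5, 97.5])    # 95% prediction interval
mean = out.mean()
```

For simulation codes too slow for thousands of direct runs, the same loop is applied to a surrogate model fit from a modest designed set of runs, which is exactly where the emulation and sensitivity-analysis topics of the course enter.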