Midwest Mathematics and Climate Conference - Day 2 Morning Session

Abstracts

  • Juan Restrepo, Oregon State University
    Data Assimilation

    • Accounting for uncertainties has led us to alter our expectations of what is predictable and of how such predictions compare to nature. In recent years, significant effort has gone into creating new uncertainty quantification techniques, rediscovering old ones, and appropriating existing ones to account for uncertainties in modeling and simulation. Is this nothing more than a greater reliance on statistical techniques in our regular business? Some of it is. However, as this presentation will recount and illustrate, there are important changes in how we go about modeling and predicting natural phenomena: Bayesian inference is used to combine models and data (not just to compare them); sensitivity analyses and projection techniques influence mean-field modeling; data classification techniques allow us to work with more general state variables, which subsume dynamic physical variables; and we exploit complex stochastic representations to better capture multiscale phenomena or to capture the small-scale correlations of big data sets. This presentation will be highly accessible to non-specialists and students. (A minimal numerical sketch of the Bayesian combination of model and data follows the program below.)

  • Matthias Morzfeld, University of California, Berkeley
    Conditions for Successful Data Assimilation

    • In data assimilation one updates a (stochastic) mathematical model with information from sparse and noisy data. In theory, the solution of the data assimilation problem is clear: a posterior probability density contains all the information one has, given the model and data. The practical use of data assimilation, however, requires answers to new questions. For example, what conclusions can one draw from the posterior if the noise in the model or data is large? When is it practical to compute the posterior in high dimensions? I will address these questions with an emphasis on how the data assimilation problem and its solution (the posterior) scale with the dimension of the problem. The analysis reveals the conditions under which a data assimilation problem is feasible in principle. I will also discuss which of the feasible problems can be solved efficiently, and in high dimensions, with particle filters and ensemble Kalman filters. (A small numerical demonstration of the dimension problem follows the program below.)

  • Laura Slivinski, Woods Hole Oceanographic Institution
    Extracting the Most from Drifter Trajectories: A Method for Lagrangian Data Assimilation

    • Measurements from passive Lagrangian ocean drifters provide a growing source of data in our oceans. These measurements are often treated as coming from a sequence of fixed positions, and the information that the data came from a continuous trajectory is often lost. Lagrangian data assimilation seeks to make the most of this information by assimilating the positions of passive drifters in an attempt to estimate the velocity field. However, these trajectories are often highly nonlinear, leading to difficulties with widely used data assimilation algorithms. Additionally, the velocity field is often modeled as a high-dimensional variable, which precludes the use of more accurate methods. We have developed a hybrid data assimilation method that exploits the advantages of two well-known data assimilation methods. This hybrid method has been successfully tested on both low- and high-dimensional systems. (A toy sketch of the Lagrangian setup follows the program below.)

  • Sebastian Reich, University of Potsdam, Germany, and University of Reading, United Kingdom
    Assimilating Data into Scientific Models: A Coupling of Measures Perspective

    • Reliable forecasting requires the combination of scientific modeling with available data. When dynamical phenomena are to be forecast, this requirement leads to sequential data assimilation problems, which are best tackled from a Bayesian perspective. Bayes' formula provides the centerpiece for Bayesian data assimilation and for Bayesian learning in general. However, beyond its conceptual simplicity and beauty, Bayes' formula is hardly ever directly applicable, in particular when it needs to be interfaced with complex scientific models. In this context it is better to talk of simulating Bayes' formula. Bayes' formula has been implemented in the setting of sequential Monte Carlo methods and general Markov chain Monte Carlo methods; however, those methods suffer from the curse of dimensionality. In my talk, I will approach Bayes' formula from an entirely different perspective, namely that of coupling probability measures and optimal transportation. This approach (i) naturally puts the popular ensemble Kalman filters into context and suggests natural extensions to non-Gaussian data assimilation problems, (ii) allows for the implementation of sequential Monte Carlo methods in high dimensions using the concept of localization, and (iii) can be combined with quasi-Monte Carlo sampling approaches. (A sketch of an ensemble transform update follows the program below.)
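
As a companion to Restrepo's abstract, here is a minimal sketch of Bayesian inference combining a model with data rather than merely comparing them. It is the scalar Gaussian case, where the posterior is available in closed form; all numbers are illustrative, not from the talk.

    # Combine a model forecast (the prior) with a noisy observation via Bayes' rule.
    # For a Gaussian prior and Gaussian observation noise the posterior is Gaussian.
    mu_f, var_f = 1.2, 0.5**2   # model forecast: N(mu_f, var_f)   (illustrative)
    y, var_o = 0.8, 0.3**2      # observation y with noise variance var_o

    gain = var_f / (var_f + var_o)    # Kalman gain: how much to trust the data
    mu_a = mu_f + gain * (y - mu_f)   # posterior mean blends model and data
    var_a = (1.0 - gain) * var_f      # posterior variance shrinks below the forecast's

    print(f"analysis: N({mu_a:.3f}, {var_a:.4f})")

The posterior is not a verdict on whether the model matches the data; it is a new estimate that both have shaped, which is the shift in practice the abstract describes.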
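
To make the dimension scaling in Morzfeld's abstract concrete, the following toy experiment (assumed here, not from the talk) runs a plain importance-sampling update with a fixed particle budget while the state dimension grows; the effective sample size collapses toward one, the textbook symptom of particle-filter degeneracy in high dimensions.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100  # particle budget, held fixed across dimensions (illustrative)

    for d in (1, 5, 10, 50, 100):
        x = rng.standard_normal((N, d))        # particles from the prior N(0, I_d)
        log_w = -0.5 * np.sum(x**2, axis=1)    # Gaussian log-likelihood of y = 0
        log_w -= log_w.max()                   # stabilize before exponentiating
        w = np.exp(log_w)
        w /= w.sum()
        ess = 1.0 / np.sum(w**2)               # effective sample size
        print(f"d = {d:3d}   effective sample size ~ {ess:6.1f} of {N}")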
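
The Lagrangian setting in Slivinski's abstract can be sketched with an augmented-state ensemble Kalman filter: the unknown flow parameter and the drifter position are stacked into one state vector, and only the position is observed. This toy (a solid-body rotation flow with one unknown rate) is a stand-in for illustration, not the speaker's hybrid method.

    import numpy as np

    rng = np.random.default_rng(1)

    def advect(z, omega, dt=0.05, steps=10):
        # Euler integration of drifter positions z in the rotation flow
        # u(x, y) = omega * (-y, x).  (Toy velocity field, assumed here.)
        om = np.asarray(omega)[..., None]
        for _ in range(steps):
            z = z + dt * om * np.stack([-z[..., 1], z[..., 0]], axis=-1)
        return z

    omega_true, M, sig_obs = 1.0, 50, 0.05      # truth, ensemble size, obs noise
    omega = rng.normal(0.5, 0.5, size=M)        # prior ensemble for the flow rate
    pos = np.tile([1.0, 0.0], (M, 1))           # members start at the true position
    z_true = np.array([1.0, 0.0])

    for cycle in range(5):
        z_true = advect(z_true, omega_true)     # true drifter trajectory
        pos = advect(pos, omega)                # forecast each member's drifter
        y = z_true + rng.normal(0.0, sig_obs, size=2)       # noisy position obs

        S = np.column_stack([omega, pos])       # augmented ensemble (M x 3)
        A = S - S.mean(axis=0)                  # ensemble anomalies
        P = A.T @ A / (M - 1)                   # sample covariance
        H = np.array([[0., 1., 0.], [0., 0., 1.]])          # observe position only
        R = sig_obs**2 * np.eye(2)
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)        # Kalman gain
        y_pert = y + rng.normal(0.0, sig_obs, size=(M, 2))  # perturbed observations
        S = S + (y_pert - S @ H.T) @ K.T        # EnKF analysis step
        omega, pos = S[:, 0], S[:, 1:]

    print(f"estimated rotation rate: {omega.mean():.3f} +/- {omega.std():.3f} "
          f"(truth {omega_true})")

The flow parameter is never observed directly; it is corrected through its sampled correlation with the drifter position, which is the essential mechanism of Lagrangian data assimilation.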
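
One concrete instance of the coupling-of-measures view in Reich's abstract is the ensemble transform Kalman filter, in which the analysis ensemble is the forecast ensemble pushed through a single matrix acting in ensemble space. Below is a minimal sketch of the standard ETKF algebra (an assumed illustration, not code from the talk).

    import numpy as np

    def etkf_update(X, y, H, R):
        # One ETKF analysis step.  X is the d x N forecast ensemble, y the
        # observation vector, H the observation operator, R the obs covariance.
        d, N = X.shape
        x_mean = X.mean(axis=1, keepdims=True)
        A = X - x_mean                                # state anomalies
        Y = H @ A                                     # observation-space anomalies
        Rinv = np.linalg.inv(R)
        Pa = np.linalg.inv((N - 1) * np.eye(N) + Y.T @ Rinv @ Y)
        w_mean = Pa @ Y.T @ Rinv @ (y - (H @ x_mean).ravel())  # mean-update weights
        vals, vecs = np.linalg.eigh((N - 1) * Pa)     # symmetric square root
        W = vecs @ np.diag(np.sqrt(vals)) @ vecs.T
        # analysis ensemble: one transform applied to the forecast anomalies
        return x_mean + A @ (w_mean[:, None] + W)

    rng = np.random.default_rng(2)
    X = rng.normal(size=(3, 20)) + np.array([[1.0], [0.0], [-1.0]])  # toy ensemble
    H = np.array([[1.0, 0.0, 0.0]])   # observe the first component only
    Xa = etkf_update(X, np.array([2.0]), H, np.array([[0.1]]))
    print(Xa.mean(axis=1))            # posterior mean pulled toward the datum

The entire update is the N x N map w_mean[:, None] + W applied to the anomalies: a deterministic coupling of the forecast ensemble to the analysis ensemble, which is the structure the optimal transportation viewpoint generalizes beyond the Gaussian case.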

Submitter

Colin James Grudzien

Department of Mathematics, University of North Carolina at Chapel Hill
