Implicitly localized ensemble observational update to cope with nonlocal/nonlinear data constraints in large-size inverse problems

This is a preprint and has not been peer reviewed. This is version 1 of this preprint.

Authors

Jean-Michel Brankart

Abstract

Many practical applications involve the resolution of large-size inverse problems, without providing more than a moderate-size sample to describe the prior probability distribution. In this situation, additional information must be supplied to augment the effective dimension of the available sample. This is the role played by covariance localization in the large-size applications of the ensemble Kalman filter. In this paper, it is suggested that covariance localization can also be efficiently applied to an approximate variant of the Metropolis-Hastings algorithm, by modulating the ensemble members by the large-scale patterns of other members. Modulation is used to design a (global) proposal probability distribution (i) that can be sampled at a very low cost (proportional to the size of the state vector, with a small integer coefficient), (ii) that automatically accounts for a localized prior covariance, and (iii) that leads to an efficient sampler for the augmented prior probability distribution or for the posterior probability distribution. The resulting algorithm is applied to an academic example, illustrating (i) the effectiveness of covariance localization, (ii) the ability of the method to deal with nonlocal/nonlinear observation operators and non-Gaussian observation errors, (iii) the possibility of dealing with non-Gaussian (even discrete) prior marginal distributions by including (stochastic) anamorphosis transformations, (iv) the reliability, resolution and optimality of the updated ensemble, using probabilistic scores appropriate to a non-Gaussian posterior distribution, and (v) the scalability of the algorithm as a function of the size of the problem. The evaluation of the computational complexity of the algorithm suggests that it could become numerically competitive with local ensemble Kalman filters, even in the absence of nonlocal constraints, especially if the localization radius is large. All code necessary to reproduce the example application described in this paper is openly available from github.com/brankart/ensdam.
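
To make the modulation idea more concrete, here is a minimal Python/NumPy sketch, not the author's ensdam implementation: it draws one proposal state as a random linear combination of ensemble anomalies that have been modulated (multiplied element-wise) by large-scale patterns, which is what implicitly localizes the prior covariance. Using smoothed members as the large-scale patterns, the Gaussian weights, and the names smooth and modulated_proposal are illustrative assumptions; the Metropolis-Hastings accept/reject step that would confront each draw with the observations is omitted.

    import numpy as np

    rng = np.random.default_rng(0)

    def smooth(field, width=10):
        # Crude large-scale filter: a moving average, used here only as a
        # stand-in for whatever large-scale pattern extraction is applied.
        kernel = np.ones(width) / width
        return np.convolve(field, kernel, mode="same")

    def modulated_proposal(xens, n_patterns=5, width=10):
        # Draw one proposal state as a random linear combination of
        # ensemble anomalies modulated (element-wise product) by
        # large-scale patterns derived from a few ensemble members.
        m, n = xens.shape
        mean = xens.mean(axis=0)
        anom = xens - mean
        patterns = np.array([smooth(anom[k], width) for k in range(n_patterns)])
        # Augmented ensemble: every anomaly times every large-scale pattern.
        augmented = np.array([anom[i] * patterns[j]
                              for i in range(m) for j in range(n_patterns)])
        # Random Gaussian weights give one sample of the (localized) proposal.
        weights = rng.standard_normal(augmented.shape[0]) / np.sqrt(augmented.shape[0])
        return mean + weights @ augmented

    # Usage: 20-member ensemble of a 1-D state with 200 grid points.
    xens = rng.standard_normal((20, 200))
    draw = modulated_proposal(xens)
    print(draw.shape)  # (200,)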

DOI

https://doi.org/10.31223/osf.io/7mcwz

Subjects

Applied Mathematics, Earth Sciences, Physical Sciences and Mathematics

Keywords

data assimilation, ensemble, inverse method, localization, MCMC algorithm, nonlinear constraints

Dates

Published: 2019-05-30 20:08

License

GNU Lesser General Public License (LGPL) 2.1