Bayesian methods are a way of obtaining information on quantities which are not directly observable. In the context of mathematical/computational predictions of some physical system, this can be seen as inferring the causes behind a set of observations. As it turns out, the "observation" can also be the result of another mathematical/computational model. In this way Bayesian inverse models can be used, for example, for upscaling and in other multi-fidelity settings.

On the computational side, it will be sketched how this can be put into a computational framework. On one hand there are the by now well-known Monte Carlo and Markov chain Monte Carlo (MCMC) methods, and on the other hand there are methods based on functional approximations. In this context one works with random variables instead of probability measures, and the conditioning inherent in the Bayesian theory is expressed as a conditional expectation.
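As a minimal illustration of the MCMC approach, the following sketch samples the Bayesian posterior of a toy one-dimensional inverse problem with a random-walk Metropolis algorithm. All concrete choices (the cubic forward model, the Gaussian prior, the noise level, the step size) are assumptions made purely for illustration, not part of the original exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inverse problem (all choices illustrative): unknown parameter q,
# forward model G(q) = q**3, noisy observation y = G(q_true) + noise.
q_true = 1.5
sigma_obs = 0.1
y = q_true**3 + sigma_obs * rng.normal()

def log_post(q):
    # Log of (unnormalised) posterior: Gaussian N(0, 1) prior
    # plus Gaussian likelihood for the observation y.
    return -0.5 * q**2 - 0.5 * ((y - q**3) / sigma_obs) ** 2

# Random-walk Metropolis: propose a perturbed state, accept with
# probability min(1, posterior ratio).
samples, q = [], 0.0
for _ in range(20_000):
    q_new = q + 0.3 * rng.normal()
    if np.log(rng.random()) < log_post(q_new) - log_post(q):
        q = q_new
    samples.append(q)

# Discard burn-in; the remaining samples approximate the posterior.
posterior_mean = np.mean(samples[5_000:])
```

Since the likelihood is sharp here, the posterior mean lands close to the true parameter; functional approximations, discussed next, aim to avoid this kind of sampling altogether.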

It will be demonstrated that the conditional expectation is based on an orthogonal projection, a procedure which translates into numerical computations in a stable way. As the conditional expectation acts on random variables, one may use it to construct a new (posterior) random variable (RV), which is a function of the prior or forecast RV and the actual observation, in other words a filter, such that this new RV has the proper Bayesian posterior distribution.
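The orthogonal-projection characterisation can be made concrete with samples: approximating the conditional expectation E[q | y] amounts to a least-squares projection of q onto functions of y, and orthogonality means the residual is uncorrelated with the basis. The joint distribution and the small polynomial basis below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative joint samples: prior q ~ N(0, 1), observation y = q + noise.
N = 100_000
q = rng.normal(size=N)
y = q + 0.5 * rng.normal(size=N)

# Approximate the conditional expectation E[q | y] by orthogonal
# projection onto span{y^3, y^2, y, 1}, i.e. a least-squares fit.
Phi = np.vander(y, 4)  # columns: y^3, y^2, y, 1
coef, *_ = np.linalg.lstsq(Phi, q, rcond=None)

# Orthogonality of the projection: the residual q - E[q | y] is
# (numerically) uncorrelated with every basis function.
resid = q - Phi @ coef
orth_defect = np.max(np.abs(Phi.T @ resid) / N)
```

In this linear-Gaussian example the exact conditional expectation is the linear map y ↦ 0.8·y, which the fitted coefficients recover; the same projection idea underlies the filter construction described above.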

When one attempts to perform all of this numerically, it becomes clear that a number of approximations are necessary to make this a practical procedure. It will be shown that well-known filters such as the family of Kálmán-like filters, which are based on the Gauss-Markov theorem, as well as variational approaches such as 3D-VAR and 4D-VAR, are some of the simplest examples of such filtering approaches.
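The simplest such approximation restricts the filter to a linear (Gauss-Markov) map of the observation, which gives a Kálmán-like update. A sketch of an ensemble version for a scalar linear-Gaussian problem follows; the identity observation operator, the noise level, and the ensemble size are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative linear-Gaussian setup (all numbers are assumptions):
# state q ~ N(0, 1), observation y = q + noise with noise std r.
N, r = 50_000, 0.5
q_true = 1.2
y_obs = q_true + r * rng.normal()

q_prior = rng.normal(size=N)               # prior / forecast ensemble
y_fore = q_prior + r * rng.normal(size=N)  # forecast of the observation

# Kalman gain from ensemble statistics: the best linear (Gauss-Markov)
# estimator of q given y.
K = np.cov(q_prior, y_fore)[0, 1] / np.var(y_fore)

# The filter: the posterior RV is a function of the prior RV
# and the actual observation.
q_post = q_prior + K * (y_obs - y_fore)
```

For this linear-Gaussian case the ensemble statistics of `q_post` reproduce the exact Bayesian posterior; for nonlinear forward models the same update is only an approximation, which is where the more general functional approximations come in.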