
Relative entropy






In general, since input data are known only to within a certain accuracy, it is important that any inversion method allow for errors in the measured data. The MRE approach can accommodate such uncertainty, and in new work described here, previous results are modified to include a Gaussian prior. A variety of MRE solutions are reproduced under a number of assumed moments, including second-order central moments. Various solutions of Jacobs & van der Geest are repeated and clarified. Menke's weighted minimum-length solution is shown to have a basis in information theory, and the classic least-squares estimate is shown to be a solution to MRE under the conditions of more data than unknowns, where the observed data and their associated noise are utilized. An example inverse problem involving a gravity survey over a layered and faulted zone is shown. In all cases the inverse results closely match the actual density profile, at least in the upper portions of the profile.
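The two estimators mentioned above can be sketched for a toy discrete linear inverse problem G m = d. This is a minimal illustration, not the paper's actual gravity example; the matrices, weights, and dimensions are all made up for the sketch.

```python
import numpy as np

# Hypothetical toy linear inverse problem G m = d (all values illustrative).
rng = np.random.default_rng(0)
G = rng.normal(size=(6, 3))          # overdetermined: more data than unknowns
m_true = np.array([1.0, -2.0, 0.5])
d = G @ m_true                       # noise-free data for the sketch

# Classic least-squares estimate: m = (G^T G)^{-1} G^T d
m_ls = np.linalg.solve(G.T @ G, G.T @ d)

# Weighted minimum-length solution for the underdetermined case:
# m = W^{-1} G^T (G W^{-1} G^T)^{-1} d, with an assumed weight matrix W.
G_u = G[:2, :]                       # underdetermined: fewer data than unknowns
d_u = d[:2]
W = np.diag([1.0, 2.0, 4.0])
Winv = np.linalg.inv(W)
m_ml = Winv @ G_u.T @ np.linalg.solve(G_u @ Winv @ G_u.T, d_u)

print(m_ls)        # recovers m_true in this noise-free, full-rank case
print(G_u @ m_ml)  # the minimum-length model fits the reduced data exactly
```

In the overdetermined, noise-free case the least-squares estimate recovers the model exactly; in the underdetermined case the weighted minimum-length solution picks the data-fitting model of smallest weighted norm.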


In this way, MRE is maximally uncommitted with respect to unknown information. It is important to note that MRE makes a strong statement that the imposed constraints are exact and complete. It will use moments of the prior distribution only if new data on these moments are not available.


MRE attempts to produce the posterior pdf based on the information provided by new moments.
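The moment-constrained update can be sketched numerically. On a discrete grid, minimizing relative entropy to a prior subject to a first-moment constraint yields an exponentially tilted prior, with the Lagrange multiplier chosen to match the new moment; the grid, prior, and target moment below are all illustrative assumptions.

```python
import numpy as np

# MRE sketch on a discrete grid: given prior p over states x and a new
# first-moment constraint E_q[x] = mu, the minimum-relative-entropy
# posterior has the form q_i proportional to p_i * exp(lam * x_i).
x = np.linspace(-3.0, 3.0, 201)
p = np.exp(-0.5 * x**2)
p /= p.sum()                          # discretized Gaussian prior, mean 0

mu = 0.8                              # new moment acting as 'data'

def tilted(lam):
    """Exponentially tilted prior and its mean."""
    q = p * np.exp(lam * x)
    q /= q.sum()
    return q, q @ x

# The posterior mean increases monotonically with lam, so solve by bisection.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    _, m = tilted(mid)
    if m < mu:
        lo = mid
    else:
        hi = mid
q_post, m_post = tilted(0.5 * (lo + hi))
print(m_post)   # matches mu to bisection tolerance
```

If no new moment is supplied, lam = 0 and the posterior simply reproduces the prior, which is the "maximally uncommitted" behavior described above.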


The focus of this paper is to illustrate important philosophies on inversion and the similarities and differences between Bayesian and minimum relative entropy (MRE) methods. The development of each approach is illustrated through the general discrete linear inverse problem. MRE differs from both Bayes and classical statistical methods in that knowledge of moments is used as ‘data’ rather than sample values. MRE, like Bayes, presumes knowledge of a prior probability distribution and produces the posterior pdf itself.
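The quantity both methods are built around can be stated concretely: the relative entropy (Kullback–Leibler divergence) of a posterior q with respect to a prior p on a discrete support. A minimal implementation, with made-up example distributions:

```python
import numpy as np

def relative_entropy(q, p):
    """D(q || p) = sum_i q_i * log(q_i / p_i), with 0 * log 0 taken as 0."""
    q, p = np.asarray(q, dtype=float), np.asarray(p, dtype=float)
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

p = np.array([0.25, 0.25, 0.25, 0.25])   # illustrative prior
q = np.array([0.40, 0.30, 0.20, 0.10])   # illustrative posterior

print(relative_entropy(q, q))  # 0.0 — vanishes iff the distributions agree
print(relative_entropy(q, p))  # strictly positive otherwise
```

MRE selects, among all distributions satisfying the moment constraints, the q that minimizes this quantity, so the posterior departs from the prior only as far as the new data demand.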





