Likelihood function

The likelihood function (often simply called the likelihood) is the joint probability mass (or probability density) of observed data, viewed as a function of the parameters of a statistical model.[1][2][3] Intuitively, the likelihood function L(θ | x) is the probability of observing the data x assuming θ is the actual parameter value.
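
As a minimal worked example (the Bernoulli model and the symbols θ, x, p, n, k below are illustrative assumptions, not fixed by this article): the same expression p(x; θ) is read as a probability of x for fixed θ, and as a likelihood of θ for fixed x.

```latex
\mathcal{L}(\theta \mid x) = p(x;\theta),
\qquad\text{e.g. } x_1,\dots,x_n \stackrel{\text{iid}}{\sim} \operatorname{Bernoulli}(p):
\quad \mathcal{L}(p \mid x) = \prod_{i=1}^{n} p^{x_i}(1-p)^{1-x_i} = p^{k}(1-p)^{n-k},
\quad k=\sum_{i=1}^{n} x_i .
```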

In maximum likelihood estimation, the arg max (over the parameter θ) of the likelihood function serves as a point estimate for θ, while the Fisher information (often approximated by the likelihood's Hessian matrix) indicates the estimate's precision.
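
A minimal numerical sketch of these two ideas, assuming a Bernoulli coin-flip model (the data and all names here are illustrative, not from the article): the arg max of the log-likelihood gives the point estimate, and the negative second derivative of the log-likelihood at that point (the observed Fisher information, the one-dimensional case of the Hessian) gives an approximate standard error.

```python
import numpy as np

# Illustrative data: observed coin flips under an assumed Bernoulli(p) model.
x = np.array([1, 0, 1, 1, 0, 1, 1, 1])
n, k = x.size, x.sum()

def log_lik(q):
    """Log-likelihood of p = q given k successes in n trials."""
    return k * np.log(q) + (n - k) * np.log(1 - q)

# Arg max over a parameter grid serves as the point estimate.
p = np.linspace(0.001, 0.999, 9999)
p_hat = p[np.argmax(log_lik(p))]
# For this model the closed form is p_hat = k / n.

# Observed Fisher information: -d^2/dp^2 of the log-likelihood at p_hat,
# approximated here by a central finite difference.
h = 1e-5
info = -(log_lik(p_hat + h) - 2 * log_lik(p_hat) + log_lik(p_hat - h)) / h**2

se = 1 / np.sqrt(info)  # larger information -> more precise estimate
print(f"MLE: {p_hat:.3f}, observed information: {info:.1f}, SE: {se:.3f}")
```

For the Bernoulli model the information at the maximum is n / (p̂(1 − p̂)), so the finite-difference value above can be checked against that closed form.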

In contrast, in Bayesian statistics, parameter estimates are derived from the converse of the likelihood, the so-called posterior probability, which is calculated via Bayes' rule.[4]
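
For reference, Bayes' rule in the notation above (π denotes an assumed prior density): the posterior is proportional to the likelihood times the prior, normalized over the parameter space.

```latex
\pi(\theta \mid x)
  = \frac{\mathcal{L}(\theta \mid x)\,\pi(\theta)}
         {\int \mathcal{L}(\theta' \mid x)\,\pi(\theta')\,d\theta'}
  \;\propto\; \mathcal{L}(\theta \mid x)\,\pi(\theta).
```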

  1. ^ Casella, George; Berger, Roger L. (2002). Statistical Inference (2nd ed.). Duxbury. p. 290. ISBN 0-534-24312-6.
  2. ^ Wakefield, Jon (2013). Frequentist and Bayesian Regression Methods (1st ed.). Springer. p. 36. ISBN 978-1-4419-0925-1.
  3. ^ Lehmann, Erich L.; Casella, George (1998). Theory of Point Estimation (2nd ed.). Springer. p. 444. ISBN 0-387-98502-6.
  4. ^ Zellner, Arnold (1971). An Introduction to Bayesian Inference in Econometrics. New York: Wiley. pp. 13–14. ISBN 0-471-98165-6.
