Likelihood function

The likelihood function (often simply called the likelihood) is the joint probability mass (or probability density) of observed data, but viewed as a function of the parameters of a statistical model.[1][2][3] That is, the likelihood function L(θ | x), which gives the likelihood of a vector of parameters θ given a set of observed data x, is numerically the same as the probability function P(x | θ), which gives the probability of observing the data x given that the vector of parameters θ is correct.
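As an illustration (not drawn from the cited sources), the sketch below uses a binomial model to show the distinction: the same formula is a probability function when the parameter is held fixed and the data varies, and a likelihood function when the observed data are held fixed and the parameter varies.

```python
from math import comb

def binomial_pmf(k, n, p):
    """Probability of observing k successes in n trials with success probability p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Probability function: fix the parameter p and vary the data k.
p_fixed = 0.5
print([round(binomial_pmf(k, n=10, p=p_fixed), 4) for k in range(11)])

# Likelihood function: fix the observed data k and vary the parameter p.
k_observed = 7
print([round(binomial_pmf(k_observed, n=10, p=p), 4) for p in (0.3, 0.5, 0.7, 0.9)])
```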

In maximum likelihood estimation, the arg max (over the parameter θ) of the likelihood function serves as a point estimate for θ, while the Fisher information (often approximated by the negative Hessian matrix of the log-likelihood at that maximum) is an indication of the estimate's precision.
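A minimal sketch of these two ideas, assuming the binomial example above: the point estimate is found by maximizing the log-likelihood over a grid of parameter values, and the observed Fisher information (the negative second derivative of the log-likelihood at the maximum, computed here analytically for the binomial model) gives an approximate standard error.

```python
import math

def log_likelihood(p, k, n):
    """Binomial log-likelihood of the parameter p given k successes in n trials."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

def mle_grid(k, n, grid_size=10_000):
    """Point estimate: the arg max of the likelihood over a grid of parameter values."""
    grid = [(i + 1) / (grid_size + 1) for i in range(grid_size)]
    return max(grid, key=lambda p: log_likelihood(p, k, n))

def observed_information(p_hat, k, n):
    """Negative second derivative of the binomial log-likelihood at the estimate;
    its inverse approximates the variance of the estimator."""
    return k / p_hat**2 + (n - k) / (1 - p_hat) ** 2

k, n = 7, 10
p_hat = mle_grid(k, n)                       # close to the analytic MLE k / n = 0.7
se = 1 / math.sqrt(observed_information(p_hat, k, n))
print(f"MLE ~ {p_hat:.4f}, approximate standard error ~ {se:.4f}")
```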

In contrast, in Bayesian statistics, parameter estimates are derived from the converse of the likelihood: the posterior probability of the parameters given the observed data, calculated via Bayes' rule.[4]
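A minimal sketch of the Bayesian route, again assuming the binomial example and a flat (uniform) prior chosen here purely for illustration: by Bayes' rule the posterior is proportional to the likelihood times the prior, approximated below on a grid of parameter values.

```python
from math import comb

def binomial_likelihood(p, k, n):
    """Likelihood of the parameter p given k successes observed in n trials."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Uniform (flat) prior on a grid of parameter values; this prior is an assumption for the example.
grid = [(i + 0.5) / 200 for i in range(200)]
prior = [1.0 for _ in grid]

# Bayes' rule: posterior is proportional to likelihood times prior, then normalized.
k, n = 7, 10
unnormalized = [binomial_likelihood(p, k, n) * pr for p, pr in zip(grid, prior)]
total = sum(unnormalized)
posterior = [u / total for u in unnormalized]

# Posterior mean as a point estimate; analytically (k + 1) / (n + 2) = 8/12 ~ 0.667 for this prior.
posterior_mean = sum(p * w for p, w in zip(grid, posterior))
print(f"posterior mean ~ {posterior_mean:.3f}")
```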

  1. ^ Casella, George; Berger, Roger L. (2002). Statistical Inference (2nd ed.). Duxbury. p. 290. ISBN 0-534-24312-6.
  2. ^ Wakefield, Jon (2013). Frequentist and Bayesian Regression Methods (1st ed.). Springer. p. 36. ISBN 978-1-4419-0925-1.
  3. ^ Lehmann, Erich L.; Casella, George (1998). Theory of Point Estimation (2nd ed.). Springer. p. 444. ISBN 0-387-98502-6.
  4. ^ Zellner, Arnold (1971). An Introduction to Bayesian Inference in Econometrics. New York: Wiley. pp. 13–14. ISBN 0-471-98165-6.