Parameter space

The parameter space is the space of possible parameter values that define a particular mathematical model. It is also sometimes called weight space, and is often a subset of finite-dimensional Euclidean space.

In statistics, parameter spaces are particularly useful for describing parametric families of probability distributions; for example, the family of normal distributions, parameterized by mean μ ∈ ℝ and standard deviation σ > 0, has the upper half-plane as its parameter space. Parameter spaces also form the background for parameter estimation. In the case of extremum estimators for parametric models, a certain objective function is maximized or minimized over the parameter space.[1] Theorems of existence and consistency of such estimators require some assumptions about the topology of the parameter space. For instance, compactness of the parameter space, together with continuity of the objective function, suffices for the existence of an extremum estimator.[1]
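As a concrete illustration of the last point, the following is a minimal sketch of an extremum estimator (an assumed toy setup, not drawn from the cited source): maximum-likelihood estimation of an exponential rate, where the objective is continuous and the parameter space is restricted to a compact interval, so a maximizer is guaranteed to exist. The data, interval bounds, and true rate are illustrative choices.

```python
# Minimal sketch of an extremum estimator: maximum-likelihood estimation of
# an exponential rate over a compact parameter space.  All numbers here are
# illustrative assumptions, not taken from the source.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
data = rng.exponential(scale=1 / 2.5, size=1000)  # samples with true rate 2.5

def neg_log_likelihood(rate):
    # Exponential log-likelihood: n*log(rate) - rate*sum(x), negated so that
    # minimizing it maximizes the likelihood.
    return -(len(data) * np.log(rate) - rate * data.sum())

# Compact parameter space [0.1, 10.0] plus a continuous objective
# guarantees that a maximizer exists.
result = minimize_scalar(neg_log_likelihood, bounds=(0.1, 10.0), method="bounded")
print(f"estimated rate: {result.x:.3f}")  # should be close to 2.5
```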

In deep learning, the parameters of a deep network are called weights. Because of the layered structure of deep networks, their weight space has a complex structure and geometry.[2][3] For example, in multilayer perceptrons, permuting the neurons of a hidden layer, which amounts to permuting the rows and columns of the adjacent weight matrices, leaves the computed function unchanged; distinct points in weight space can therefore represent the same function. This permutation symmetry underlies equivariant architectures for learning on deep weight spaces.[2]
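The symmetry can be checked directly. The sketch below (an assumed toy example, not code from the cited papers) builds a one-hidden-layer network and verifies that permuting the hidden units, i.e. the rows of the first weight matrix and the columns of the second, leaves the output unchanged:

```python
# Toy demonstration (assumed example): permuting the hidden units of a
# one-hidden-layer MLP yields a different point in weight space that
# computes exactly the same function.
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)  # input (3) -> hidden (5)
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)  # hidden (5) -> output (2)

def mlp(x, W1, b1, W2, b2):
    h = np.tanh(W1 @ x + b1)  # hidden activations (elementwise nonlinearity)
    return W2 @ h + b2

P = np.eye(5)[rng.permutation(5)]  # random permutation matrix for the hidden layer

x = rng.normal(size=3)
y = mlp(x, W1, b1, W2, b2)
y_permuted = mlp(x, P @ W1, P @ b1, W2 @ P.T, b2)  # permute rows of W1/b1, columns of W2
assert np.allclose(y, y_permuted)  # identical outputs from distinct weights
```

Because the nonlinearity acts elementwise, it commutes with the permutation, which is why the two weight configurations agree at every input, not just this one.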

Parameters are sometimes analyzed to see how they affect the behavior of a model. In that context, they can be viewed as inputs of a function, in which case the parameter space is the domain of that function. The ranges of the parameters may form the axes of a plot, and particular outcomes of the model may be plotted against these axes to illustrate how different regions of the parameter space produce different types of behavior in the model.
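As a hypothetical illustration (the model and parameter ranges are assumptions, not from the source), the sketch below colors a two-dimensional parameter space by qualitative behavior: for a damped oscillator x'' + c·x' + k·x = 0, the sign of the discriminant c² − 4k of the characteristic equation separates non-oscillatory (overdamped) from oscillatory (underdamped) motion, producing two regions in the (c, k) plane.

```python
# Hypothetical example: mapping regions of a 2-D parameter space.
# For x'' + c*x' + k*x = 0, the discriminant c**2 - 4*k determines whether
# the solution oscillates; plotting its sign partitions the (c, k) plane.
import numpy as np
import matplotlib.pyplot as plt

c = np.linspace(0.0, 4.0, 200)  # damping coefficient axis
k = np.linspace(0.0, 4.0, 200)  # stiffness axis
C, K = np.meshgrid(c, k)
overdamped = C**2 - 4 * K > 0   # True where the motion does not oscillate

plt.pcolormesh(C, K, overdamped, shading="auto")
plt.xlabel("damping c")
plt.ylabel("stiffness k")
plt.title("Overdamped vs. underdamped regions of the (c, k) parameter space")
plt.show()
```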

  1. Hayashi, Fumio (2000). Econometrics. Princeton University Press. p. 446. ISBN 0-691-01018-8.
  2. Navon, Aviv; Shamsian, Aviv; Achituve, Idan; Fetaya, Ethan; Chechik, Gal; Maron, Haggai (2023). "Equivariant Architectures for Learning in Deep Weight Spaces". Proceedings of the 40th International Conference on Machine Learning. PMLR: 25790–25816.
  3. Hecht-Nielsen, Robert (1990). "On the Algebraic Structure of Feedforward Network Weight Spaces". In Eckmiller, Rolf (ed.). Advanced Neural Computers. Amsterdam: North-Holland. pp. 129–135. ISBN 978-0-444-88400-8.
