Maximally Informative Dimensions (MID)

Maximally Informative Dimensions, or MID, is a computational method based on information-theoretic principles for estimating a neuron's receptive field. "Maximizing information" here means choosing the receptive field so as to maximize the Kullback-Leibler divergence between two probability distributions: the prior distribution of stimuli projected onto the candidate dimension, and the distribution of those projections conditional on the neuronal response[1]. Shannon's mutual information is the natural objective for analyzing neural responses because it provides a rigorous way of comparing these two distributions[1]. As an optimization scheme, MID has the advantage that it does not depend on any specific statistical properties of the stimulus ensemble, which makes it applicable to natural stimuli.
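The idea can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it assumes synthetic Gaussian stimuli whose spikes come from a hidden linear-threshold model, estimates the two projected distributions with histograms, and maximizes their KL divergence by simple random hill-climbing on the unit sphere (the original paper uses gradient ascent with annealing).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumed for this demo): Gaussian stimuli, with spikes
# generated by thresholding the projection onto a hidden "true" receptive field.
n_samples, dim = 20000, 10
stimuli = rng.standard_normal((n_samples, dim))
true_rf = rng.standard_normal(dim)
true_rf /= np.linalg.norm(true_rf)
spikes = stimuli @ true_rf > 1.0

def projected_information(v, stimuli, spikes, n_bins=25):
    """KL divergence (in bits) between P(x|spike) and P(x), where x is the
    stimulus projected onto the candidate dimension v."""
    x = stimuli @ (v / np.linalg.norm(v))
    edges = np.histogram_bin_edges(x, bins=n_bins)
    p_prior, _ = np.histogram(x, bins=edges)
    p_spike, _ = np.histogram(x[spikes], bins=edges)
    p_prior = p_prior / p_prior.sum()
    p_spike = p_spike / p_spike.sum()
    mask = p_spike > 0  # spike stimuli are a subset, so p_prior > 0 here too
    return np.sum(p_spike[mask] * np.log2(p_spike[mask] / p_prior[mask]))

# Random hill-climbing on the unit sphere (a crude stand-in for the
# gradient ascent used in the original paper).
v = rng.standard_normal(dim)
v /= np.linalg.norm(v)
best = projected_information(v, stimuli, spikes)
for _ in range(2000):
    candidate = v + 0.1 * rng.standard_normal(dim)
    candidate /= np.linalg.norm(candidate)
    info = projected_information(candidate, stimuli, spikes)
    if info > best:
        v, best = candidate, info

alignment = abs(v @ true_rf)  # approaches 1 when the dimension is recovered
```

Because the spike-conditional distribution differs from the prior only along the true receptive field, the information-maximizing dimension recovers it (up to sign), without assuming anything about the stimulus statistics beyond what the histograms estimate.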

  1. Sharpee, T.; Rust, N. C.; Bialek, W. (2004). "Analyzing neural responses to natural signals: Maximally informative dimensions". Neural Computation, 16(2), 223-250.
