Maximally informative dimensions (MID) is a computational method, based on information-theoretic principles, for estimating a neuron's receptive field. The method seeks the receptive-field dimensions (directions in stimulus space) that maximize the Kullback–Leibler divergence between two probability distributions: the prior distribution of the reduced (projected) stimuli and the distribution of reduced stimuli conditional on the neuronal response[1]. When analyzing neural responses, Shannon's mutual information is useful because it provides a rigorous way of comparing these two distributions[1]. As an optimization scheme, MID has the advantage that it does not depend on any specific statistical properties of the stimulus ensemble.
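The quantity described above can be sketched in code. The following is a minimal illustration, not the published implementation: it projects stimuli onto a candidate direction, estimates the prior and spike-conditional distributions with simple histograms (the bin count `n_bins` and the toy linear–nonlinear neuron used below are illustrative assumptions), and evaluates the Kullback–Leibler divergence that MID maximizes.

```python
import numpy as np

def mid_information(v, stimuli, spikes, n_bins=15):
    """KL divergence D( P(x|spike) || P(x) ) for projections x = stimuli @ v."""
    v = v / np.linalg.norm(v)          # candidate receptive-field direction
    x = stimuli @ v                    # reduced (projected) stimuli
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    # Prior distribution of projections over the whole stimulus ensemble.
    p_prior, _ = np.histogram(x, bins=edges)
    p_prior = p_prior / p_prior.sum()
    # Spike-conditional distribution: weight each projection by its spike count.
    p_spike, _ = np.histogram(x, bins=edges, weights=spikes)
    p_spike = p_spike / p_spike.sum()
    # Sum only over bins where both distributions have mass.
    mask = (p_spike > 0) & (p_prior > 0)
    return np.sum(p_spike[mask] * np.log2(p_spike[mask] / p_prior[mask]))

# Toy example: a neuron whose spiking depends on one stimulus dimension only.
rng = np.random.default_rng(0)
stimuli = rng.standard_normal((20000, 5))
true_filter = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
rate = 1.0 / (1.0 + np.exp(-3.0 * (stimuli @ true_filter)))  # sigmoidal nonlinearity
spikes = rng.poisson(rate)

info_true = mid_information(true_filter, stimuli, spikes)
info_orth = mid_information(np.array([0.0, 1.0, 0.0, 0.0, 0.0]), stimuli, spikes)
```

In the toy example, the information along the true filter is substantially larger than along an orthogonal direction, which is precisely the signal a full MID optimizer would climb (in practice via gradient ascent with annealing over candidate directions).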