
Information theory


Mutual information is one of many quantities that measure how much one random variable tells us about another.

Mutual information

It is a dimensionless quantity with (generally) units of bits, and can be thought of as the reduction in uncertainty about one random variable given knowledge of another. High mutual information indicates a large reduction in uncertainty; low mutual information indicates a small reduction; and zero mutual information between two random variables means the variables are independent.

Definition

For two discrete variables X and Y whose joint probability distribution is P_{XY}(x,y), the mutual information between them, denoted I(X;Y), is given by (Shannon and Weaver, 1949; Cover and Thomas, 1991)

I(X;Y) = \sum_{x,y} P_{XY}(x,y) \log \frac{P_{XY}(x,y)}{P_X(x)\,P_Y(y)} = E_{P_{XY}}\!\left[\log \frac{P_{XY}}{P_X P_Y}\right] . \tag{1}
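As a concrete illustration of Eq. (1), the short Python sketch below (an addition for illustration, assuming NumPy; the function name is not from the original text) evaluates I(X;Y) in bits directly from a discrete joint probability table, and checks the two limiting cases described above: perfectly dependent variables and independent variables.

    import numpy as np

    def mutual_information_bits(p_xy):
        """I(X;Y) in bits for a discrete joint probability table p_xy[x, y], per Eq. (1)."""
        p_xy = np.asarray(p_xy, dtype=float)
        p_x = p_xy.sum(axis=1, keepdims=True)   # marginal P_X(x), shape (nx, 1)
        p_y = p_xy.sum(axis=0, keepdims=True)   # marginal P_Y(y), shape (1, ny)
        prod = p_x * p_y                        # product distribution P_X(x) P_Y(y)
        mask = p_xy > 0                         # cells with P_XY(x,y) = 0 contribute nothing
        return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

    # Perfectly correlated bits: knowing Y removes all uncertainty about X -> 1 bit.
    print(mutual_information_bits([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
    # Independent bits: the joint factorises into P_X(x) P_Y(y) -> 0 bits.
    print(mutual_information_bits([[0.25, 0.25], [0.25, 0.25]]))  # 0.0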

This volume will be useful to practising scientists and students working on the application of statistical models to real materials, or to processes with perturbations of a Poisson process, a uniform process, or a state of independence for a bivariate process.

Information Geometry

We use information geometry to provide a common differential geometric framework for a wide range of illustrative applications, including amino acid sequence spacings in protein chains, cryptology studies, clustering of communications and galaxies, cosmological voids, coupled spatial statistics in stochastic fibre networks and stochastic porous media, and quantum chaology.

Introductory sections are provided on mathematical statistics, differential geometry and the information geometry of spaces of probability density functions.

Keywords: 53B50, 60D05, 62B10, 62P35, 74E35, 92D20; Gamma models; independence perturbation; information geometry; inter-event spacing; Poisson perturbation.

Prof. JUN ZHANG, Department of Psychology, University of Michigan

PUBLICATIONS BY TOPICS

1. sensory encoding (receptive field, tuning, map, etc.)
2. visual perception (binding, object segregation, motion, etc.)

1. stimulus-response compatibility (Fitts, Simon, Eriksen, Hedge-Marsh, etc.)
2. “decision” process (sensory-motor locus of neuron, stimulus/response components in ERP)

1. reinforcement learning (dopamine, TD learning, action policy, etc.)
2. classification and regularized learning (reproducing kernels, feature map, etc.)

1. theory-of-mind recursive reasoning (“I think you think I think …”)
2. game theory (prisoner’s dilemma, meta-game, rationality, etc.)

MATHEMATICAL PSYCHOLOGY (various topics)
1. signal detection theory
2. individual and social choice
3. model selection
4. dynamical system
5. neural network