Mean mutual information and symmetry breaking for finite random fields
2012
G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and lack of uniqueness.
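The abstract describes an intricacy as a weighted average of the mutual information between a subfamily and its complement. As a minimal illustrative sketch (not the paper's exact weighting scheme — here the average over nonempty proper subsets is taken uniformly), one can compute such a functional for a small system of binary random variables with a given joint law:

```python
# Illustrative sketch: averaging I(X_S ; X_{S^c}) over subsets S of a
# small system of binary variables. The uniform weights below are an
# assumption for illustration; the paper characterizes admissible weights.
from itertools import combinations, product
import math

def marginal(p, idx):
    """Marginal law of the coordinates in idx, from a joint law p on {0,1}^n."""
    m = {}
    for x, px in p.items():
        key = tuple(x[i] for i in idx)
        m[key] = m.get(key, 0.0) + px
    return m

def mutual_information(p, S, n):
    """I(X_S ; X_{S^c}) in bits, for a joint law p given as a dict
    mapping tuples in {0,1}^n to probabilities."""
    Sc = [i for i in range(n) if i not in S]
    pS, pSc = marginal(p, S), marginal(p, Sc)
    mi = 0.0
    for x, px in p.items():
        if px > 0:
            a = pS[tuple(x[i] for i in S)]
            b = pSc[tuple(x[i] for i in Sc)]
            mi += px * math.log2(px / (a * b))
    return mi

def mean_mutual_information(p, n):
    """Uniform average of I(X_S ; X_{S^c}) over nonempty proper subsets S
    (an intricacy-like functional under the uniform-weight assumption)."""
    subsets = [S for k in range(1, n) for S in combinations(range(n), k)]
    return sum(mutual_information(p, list(S), n) for S in subsets) / len(subsets)

# Two independent fair bits: every I(X_S ; X_{S^c}) vanishes.
p_indep = {x: 0.25 for x in product((0, 1), repeat=2)}
print(mean_mutual_information(p_indep, 2))  # 0.0

# Two perfectly correlated bits: I(X_1 ; X_2) = 1 bit.
p_corr = {(0, 0): 0.5, (1, 1): 0.5}
print(mean_mutual_information(p_corr, 2))  # 1.0
```

The two extremes shown (independence versus perfect correlation) bracket the behaviour the paper studies: intricacy is maximized by laws that balance integration against segregation, and its maximizers turn out to break the permutation symmetry of the functional.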
Keywords:
- Mathematical analysis
- Random element
- Explicit symmetry breaking
- Spontaneous symmetry breaking
- Combinatorics
- Symmetry breaking
- Mathematics
- Random field
- Convergence of random variables
- Algebra of random variables
- Mutual information
- Multivariate random variable
- Statistical physics
- Discrete mathematics
References: 25
Citations: 2