| Date: | Thu, 06 Dec 2018 10:45:04 -0600 |
|---|---|
| From: | Lucas Morton <lamorton@xxxxxxxx> |
| Subject: | [AIRG] Higher-arity mutual information |
Hi all,

Here are the references I've found on extending measures of redundancy/information beyond two variables using the concept of the maximum-entropy distribution subject to N-variable marginal distributions (N > 1). Interestingly, this wheel has been invented several times. [1] The papers fall into two groups: those (by Amari and by Schneidman et al.) that constrain all the marginals up to order N, and those that constrain proper subsets of the marginals up to a given order. Of the latter group, Bertschinger et al. and Griffith & Koch approach the issue by considering two-variable marginals, with one variable always being the designated 'target' variable. Cavallo & Pittarelli, while presenting a more general framework, did not make connections outside the realm of databases.
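For concreteness, here is a minimal sketch of the shared construction for small discrete variables: iterative proportional fitting onto the pairwise marginals converges to the maximum-entropy joint consistent with them. This is my own illustration, not code from any of the papers; the names `maxent_pairwise` and `pairwise_marginal` are made up for the example.

```python
import itertools
import numpy as np

def pairwise_marginal(p, i, j):
    """Marginalize the joint table p (one axis per variable) onto axes (i, j)."""
    axes = tuple(k for k in range(p.ndim) if k not in (i, j))
    return p.sum(axis=axes)

def maxent_pairwise(p_target, n_iter=200):
    """Max-entropy joint with the same pairwise marginals as p_target (via IPF)."""
    q = np.full_like(p_target, 1.0 / p_target.size)  # start from the uniform joint
    pairs = list(itertools.combinations(range(p_target.ndim), 2))
    for _ in range(n_iter):
        for (i, j) in pairs:
            target = pairwise_marginal(p_target, i, j)
            current = pairwise_marginal(q, i, j)
            # rescale q so its (i, j) marginal matches the target marginal
            ratio = np.divide(target, current, out=np.ones_like(target),
                              where=current > 0)
            shape = [1] * p_target.ndim
            shape[i], shape[j] = target.shape
            q = q * ratio.reshape(shape)
    return q

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p = rng.random((2, 2, 2))
    p /= p.sum()                      # a random joint over three binary variables
    q = maxent_pairwise(p)
    H = lambda t: -(t[t > 0] * np.log2(t[t > 0])).sum()
    # H(q) >= H(p): the gap is the information carried by interactions
    # beyond the pairwise marginals (zero iff p is already pairwise max-ent).
    print(H(q) - H(p))
```

The entropy gap printed at the end is, modulo notation, the quantity these papers use to measure what the higher-order interactions contribute beyond the constrained marginals.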
Many of the references within are worth reading for background. The Williams-Beer "Partial Information Decomposition" and McGill's "interaction information" (which is not guaranteed to be nonnegative) are other attempts to construct a multivariate generalization of mutual information.
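For reference, the three-variable form of McGill's interaction information (sign conventions differ across the literature) is

    I(X;Y;Z) = I(X;Y) - I(X;Y|Z)

which is symmetric in its arguments and can be negative; that possibility of a negative value is exactly what the Williams-Beer decomposition tries to avoid by building everything from nonnegative redundancy/synergy terms.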
[1] (...whereas Newton could say, "If I have seen a little farther than
others, it is because I have stood on the shoulders of giants," I am
forced to say, "Today we stand on each other's feet." - Richard Hamming)
---------------------------------------------------
Lucas A. Morton
Research Associate
Department of Engineering Physics
University of Wisconsin - Madison