The Chinese University of Hong Kong
Apr 15-17, 2013
If information theory is about (Shannon) entropy, then "network information theory" is about "joint entropy".
Constraints on joint entropy, mostly in the form of information inequalities, have long been a subject of interest in information theory. For a long time, the only known tool for proving such inequalities was the nonnegativity of (conditional) mutual information, and inequalities that can be proved this way are collectively called Shannon-type inequalities. The first unconstrained non-Shannon-type inequality, discovered in 1998 by Zhang and Yeung, revealed the incompleteness of the Shannon-type inequalities.
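To make this concrete (two standard statements from the literature, added here for illustration): every Shannon-type inequality follows from instances of the nonnegativity of conditional mutual information,

    I(X;Y \mid Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) \ge 0,

whereas the Zhang-Yeung inequality, in its commonly quoted form, states that any four jointly distributed random variables A, B, C, D satisfy

    2 I(C;D) \le I(A;B) + I(A;C,D) + 3 I(C;D \mid A) + I(C;D \mid B),

and this cannot be derived from such instances.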
Since then, determining the complete set of inequalities that joint entropy satisfies, and thus characterizing the so-called space of entropic vectors, has emerged as a major challenge in information theory, one with far-reaching implications and applications. For example, it is now recognized that knowledge of this space would reduce the most general network coding problems on acyclic networks (all of which are open) to convex optimization. The study of joint entropy has been shown to have surprising connections to seemingly unrelated areas of mathematics and computer science, including matroid theory, group theory, combinatorics, determinantal inequalities, and Kolmogorov complexity, and it has also inspired the pursuit of new constraints on the von Neumann entropy in quantum mechanics. The area has proven to be very rich and the problems challenging and fundamental. See [1] for a tutorial and survey of the subject.
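For readers new to the object itself: the entropic vector of n jointly distributed random variables lists the joint entropies H(X_S) of all 2^n - 1 nonempty subsets S of the variables, and the space above is the (closure of the) set of all vectors arising this way. The following Python sketch, added here purely as an illustration (the function name entropic_vector is ours, not from the workshop materials), computes this vector from a joint probability mass function.

    import itertools
    import math

    def entropic_vector(pmf, n):
        """Joint entropies H(X_S), in bits, for every nonempty subset S.

        `pmf` maps each outcome tuple (x_1, ..., x_n) to its probability.
        Returns a dict keyed by index tuples such as (0,), (0, 2), ...
        """
        h = {}
        for r in range(1, n + 1):
            for subset in itertools.combinations(range(n), r):
                # Marginalize the joint distribution onto the chosen coordinates.
                marginal = {}
                for outcome, p in pmf.items():
                    key = tuple(outcome[i] for i in subset)
                    marginal[key] = marginal.get(key, 0.0) + p
                h[subset] = -sum(p * math.log2(p) for p in marginal.values() if p > 0)
        return h

    # Example: X1, X2 are independent uniform bits and X3 = X1 XOR X2.
    # Each singleton has entropy 1, each pair 2, and all three together 2,
    # so the entropic vector is (1, 1, 1, 2, 2, 2, 2).
    pmf = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
    print(entropic_vector(pmf, 3))

Deciding which vectors in this 2^n - 1 dimensional space are entropic, beyond small examples like this one, is precisely the open characterization problem described above.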
The goal of this workshop is to bring together researchers who are active and/or interested in this area, to survey the progress that has been made, to describe some of the latest results, and to discuss avenues and approaches for attacking the problems that remain open. The workshop will include a small set of tutorial presentations, various research talks, and ample time for interactive discussion and reflection.