
Shannon's Information Measures and Markov Structures

Speaker: Prof. Raymond Yeung
Time: 2016-11-09, 14:00-15:00
Venue: FIT 1-222

Abstract:

Most studies of finite Markov random fields assume that the underlying probability mass function (pmf) of the random variables is strictly positive. Under this assumption, the pmf takes the form of a Gibbs measure, which possesses many nice properties.
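
For concreteness, the factorization referred to here is the standard Hammersley-Clifford form (the notation below is ours, not the speaker's): a strictly positive pmf that is Markov with respect to a graph G factorizes over the cliques of G as

\[
  p(x) = \frac{1}{Z} \prod_{C \in \mathcal{C}(G)} \psi_C(x_C), \qquad \psi_C > 0,
\]

where Z is a normalizing constant (the partition function) and \mathcal{C}(G) is the set of cliques of G.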

In general, pmfs that are not strictly positive have very complicated conditional independence structures and are difficult to handle. One way to alleviate this difficulty is to use conditional mutual information to characterize conditional mutual independencies. Specifically, for random variables X, Y, and Z, X and Y are conditionally independent given Z if and only if I(X;Y|Z)=0, regardless of whether the underlying pmf is strictly positive.
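
As a quick numerical check of this characterization, here is a minimal sketch in Python (the joint distribution and the helper name cond_mutual_info are invented for illustration):

import numpy as np

def cond_mutual_info(p):
    """I(X;Y|Z) in bits, for a joint pmf given as an array p[x, y, z]."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalize, just in case
    pz  = p.sum(axis=(0, 1))             # p(z)
    pxz = p.sum(axis=1)                  # p(x, z)
    pyz = p.sum(axis=0)                  # p(y, z)
    total = 0.0
    for x in range(p.shape[0]):
        for y in range(p.shape[1]):
            for z in range(p.shape[2]):
                if p[x, y, z] > 0:       # 0 log 0 = 0 by convention
                    total += p[x, y, z] * np.log2(
                        p[x, y, z] * pz[z] / (pxz[x, z] * pyz[y, z]))
    return total

# Build p(x,y,z) = p(z) p(x|z) p(y|z), so X and Y are
# conditionally independent given Z by construction.
pz   = np.array([0.5, 0.5])
px_z = np.array([[0.9, 0.1],             # p(x|z), rows indexed by z
                 [0.2, 0.8]])
py_z = np.array([[0.3, 0.7],             # p(y|z), rows indexed by z
                 [0.6, 0.4]])
p = np.einsum('z,zx,zy->xyz', pz, px_z, py_z)
print(cond_mutual_info(p))               # ~0.0, i.e. I(X;Y|Z) = 0

Because p(x,y,z) = p(z)p(x|z)p(y|z) by construction, the printed value is 0 up to floating-point error; perturbing the construction away from conditional independence makes it strictly positive. Note that the computation never requires p to be strictly positive.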

In the 1990s, the theory of the I-Measure was developed as a full-fledged set-theoretic interpretation of Shannon's information measures. In this talk, we first give an overview of this theory. We then discuss a set of tools, built on the I-Measure, that is particularly suited to studying a special Markov structure called full conditional mutual independence (FCMI), which turns out to be a building block for Markov random fields. One application of these tools is to show that the I-Measure of a Markov chain (a special case of a Markov random field) exhibits a very simple structure and is always nonnegative.
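
To give a flavor of the set-theoretic correspondence (these identities are standard in the theory; \tilde{X} denotes the set variable associated with X and \mu^* the I-Measure):

\[
  \mu^*(\tilde{X} \cap \tilde{Y}) = I(X;Y), \qquad
  \mu^*(\tilde{X} - \tilde{Y}) = H(X|Y), \qquad
  \mu^*(\tilde{X} \cap \tilde{Y} - \tilde{Z}) = I(X;Y|Z).
\]

For instance, for a Markov chain X_1 \to X_2 \to X_3, the atom \tilde{X}_1 \cap \tilde{X}_3 - \tilde{X}_2 has measure I(X_1;X_3|X_2) = 0, and all remaining atoms carry nonnegative measure, which is the simple structure alluded to above.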

In the last part of the talk, we discuss some recent results along this line: (i) a characterization of the Markov structure of a subfield of a Markov random field; and (ii) a proof that the Markov chain is the only Markov random field whose I-Measure is always nonnegative.

Short Bio:

Prof. Yeung was a member of the Board of Governors of the IEEE Information Theory Society from 1999 to 2001. He has served on the committees of a number of information theory symposia and is currently on the editorial boards of several international journals. He is the author of the books A First Course in Information Theory (Springer 2002) and its revision Information Theory and Network Coding (Springer 2008). He is the recipient of a number of awards, including the Croucher Senior Research Fellowship for 2000/01, the 2005 IEEE Information Theory Society Paper Award, the Friedrich Wilhelm Bessel Research Award from the Alexander von Humboldt Foundation in 2007, and the 2016 IEEE Eric E. Sumner Award. In 2015, he was named an Outstanding Overseas Chinese Information Theorist by the China Information Theory Society. His research interests are in information theory and network coding.