Multiuser information theory has been at the forefront of research for more than five decades. This talk investigates the multi-terminal source coding problem in the vector Gaussian setting. The main effort is devoted to characterizing the rate-distortion regions of several fundamental models, such as the distributed source coding problem and the multiple description coding problem.
First, we re-derive the rate-distortion region of the vector Gaussian one-help-one problem. We express the differential entropy in terms of the Fisher information matrix and derive a new extremal inequality based on integration over a path of a continuous Gaussian perturbation. We then apply this extremal inequality to characterize the entire rate region of the vector Gaussian one-help-one problem.
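For context (this identity is standard and is not spelled out in the abstract; the talk's exact formulation may differ), the perturbation argument alluded to above is typically built on the vector de Bruijn identity, which ties differential entropy to the Fisher information matrix. For a random vector X independent of Z ~ N(0, Σ_Z),
\[
\frac{\partial}{\partial \gamma}\, h\!\left(\mathbf{X} + \sqrt{\gamma}\,\mathbf{Z}\right)
= \frac{1}{2}\,\operatorname{tr}\!\left(\boldsymbol{\Sigma}_{\mathbf{Z}}\, \mathbf{J}\!\left(\mathbf{X} + \sqrt{\gamma}\,\mathbf{Z}\right)\right),
\]
where J(·) denotes the Fisher information matrix. Integrating this derivative over the perturbation path γ ∈ [0, 1] expresses entropy differences as integrals of Fisher information matrices, which is the flavor of argument underlying such extremal inequalities.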
Second, we introduce the problem of multiple description coding with tree-structured distortion constraints. In particular, a single-letter lower bound on the minimum sum rate is derived. For the vector Gaussian source setting with covariance distortion constraints, it is shown that this lower bound matches the sum rate achieved by a generalized El Gamal-Cover scheme.
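For concreteness (this is the standard formulation of a covariance distortion constraint; the per-node constraints of the tree-structured problem in the talk may be stated differently), a reconstruction \(\hat{\mathbf{X}}\) formed from a given subset of descriptions is required to satisfy a positive semidefinite ordering on its error covariance,
\[
\mathbb{E}\!\left[(\mathbf{X}-\hat{\mathbf{X}})(\mathbf{X}-\hat{\mathbf{X}})^{\mathsf{T}}\right] \preceq \mathbf{D},
\]
where D is a prescribed positive semidefinite distortion matrix associated with that subset.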
Yinfei Xu received the B.E. and Ph.D. degrees in 2008 and 2016, respectively, both in Information Engineering, from Southeast University, Nanjing, China. Since March 2016, he has been with the Institute of Network Coding at The Chinese University of Hong Kong, Hong Kong, where he is currently a Postdoctoral Fellow. He was a visiting student in the Department of Electrical and Computer Engineering at McMaster University, Hamilton, ON, Canada, from July 2014 to January 2015. His research interests include information theory, signal processing, and wireless communications.