Information capacity of MIMO channels with relative entropy constraint
Date
2006
ISBN
1-4244-0504-1; 978-1-4244-0504-6
Source
IEEE International Symposium on Information Theory - Proceedings
Pages
876-880
Abstract
This paper addresses the capacity of the multiple-input multiple-output (MIMO) wireless channel when the probability distribution p(H) of the channel matrix is not completely known to the transmitter and the receiver. Partial knowledge of the true channel matrix distribution is modelled by a relative entropy constraint: all admissible distributions p(H) satisfy D(p∥pnom) ≤ d, d ≥ 0, i.e., they lie within distance d of the so-called nominal channel matrix distribution pnom(H). The information channel capacity is defined as a maximin optimization problem with the mutual information as the pay-off function: the minimum is taken over the channel matrix distribution, and the maximum over the covariance matrix of the transmitted signal. Based on the derived properties of the pay-off function, a formula is obtained for the channel matrix distribution that minimizes the mutual information. For the case of Rayleigh fading, formulas for the information capacity and the optimal transmit covariance matrix are derived, and the existence of a saddle point of the maximin optimization problem is established. © 2006 IEEE.
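The ingredients of the maximin problem can be illustrated numerically. The sketch below is an assumption of this record, not code from the paper: it evaluates the Gaussian-input pay-off I(x;y) = log2 det(I + H Q Hᴴ/σ²) for a fixed channel realization, and checks the relative entropy constraint D(p∥pnom) ≤ d for the special case where both channel-matrix distributions are zero-mean Gaussians (as in Rayleigh fading), for which the KL divergence has a closed form.

```python
import numpy as np

def mimo_mutual_info(H, Q, noise_var=1.0):
    """Pay-off function: I(x;y) = log2 det(I + H Q H^H / noise_var) in bits."""
    nr = H.shape[0]
    M = np.eye(nr) + (H @ Q @ H.conj().T) / noise_var
    _, logdet = np.linalg.slogdet(M)   # M is I + PSD, so det(M) > 0
    return logdet / np.log(2.0)

def kl_gaussian(Sp, Sq):
    """D(p || q) in nats between zero-mean Gaussians N(0, Sp) and N(0, Sq)."""
    k = Sp.shape[0]
    inv_q = np.linalg.inv(Sq)
    _, ldq = np.linalg.slogdet(Sq)
    _, ldp = np.linalg.slogdet(Sp)
    return 0.5 * np.real(np.trace(inv_q @ Sp) - k + ldq - ldp)

# Illustrative setup (hypothetical parameters): nominal i.i.d. Rayleigh-type
# channel, uniform power allocation Q = (P/nt) I with total power P.
rng = np.random.default_rng(0)
nt, nr, P, d = 2, 2, 4.0, 0.5
H = rng.normal(size=(nr, nt))
Q = (P / nt) * np.eye(nt)              # uniform transmit covariance, tr(Q) = P
rate = mimo_mutual_info(H, Q)

# A perturbed channel-entry covariance is admissible iff its KL distance
# from the nominal (identity) covariance stays within the ball of radius d.
Sigma_pert = 1.2 * np.eye(nt * nr)
admissible = kl_gaussian(Sigma_pert, np.eye(nt * nr)) <= d
```

The paper's minimization over p(H) searches this KL ball for the least favourable distribution; the sketch only shows how the pay-off and the constraint are evaluated for one candidate.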