Sequential Necessary and Sufficient Conditions for Capacity Achieving Distributions of Channels with Memory and Feedback
Date: 2017
Source: IEEE Transactions on Information Theory
Volume: 63
Issue: 11
Pages: 7095-7115

Abstract
We derive sequential necessary and sufficient conditions for any channel input conditional distribution P_{0,n} ≜ {P_{X_t|X^{t-1},Y^{t-1}} : t = 0, ..., n} to maximize the finite-time horizon directed information defined by C^{FB}_{X^n→Y^n} ≜ sup_{P_{0,n}} I(X^n → Y^n), where I(X^n → Y^n) = Σ_{t=0}^{n} I(X^t; Y_t | Y^{t-1}), for channel distributions {P_{Y_t|Y^{t-1},X_t} : t = 0, ..., n} and {P_{Y_t|Y_{t-M}^{t-1},X_t} : t = 0, ..., n}, where Y^t ≜ {Y_{-1}, Y_0, ..., Y_t} and X^t ≜ {X_0, ..., X_t} are the channel output and input random processes, respectively, and M is a finite non-negative integer. We apply the necessary and sufficient conditions to application examples of time-varying channels with memory and derive recursive closed-form expressions for the optimal distributions that maximize the finite-time horizon directed information. Furthermore, we derive the feedback capacity from the asymptotic properties of the optimal distributions by investigating the limit C^{FB}_{X^∞→Y^∞} ≜ lim_{n→∞} (1/(n+1)) C^{FB}_{X^n→Y^n}, without any a priori assumptions such as stationarity, ergodicity, or irreducibility of the channel distribution. The framework based on sequential necessary and sufficient conditions can easily be applied to a variety of channels with memory beyond the ones considered in this paper. © 1963-2012 IEEE.
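As a small numeric sanity check (not from the paper), the decomposition I(X^n → Y^n) = Σ_t I(X^t; Y_t | Y^{t-1}) can be evaluated by brute force on a toy example. The sketch below, with illustrative names and parameters, computes the directed information for two uses of a binary symmetric channel driven by i.i.d. uniform inputs without feedback, where directed information is known to coincide with the ordinary mutual information I(X^n; Y^n).

```python
import itertools
from math import log2

# Toy check of I(X^n -> Y^n) = sum_t I(X^t; Y_t | Y^{t-1}) for a
# memoryless BSC used without feedback, where it equals I(X^n; Y^n).
# Names and parameters here are illustrative, not from the paper.

eps = 0.1  # BSC crossover probability

# Joint distribution P(x1, x2, y1, y2) for i.i.d. uniform inputs.
joint = {}
for x1, x2, y1, y2 in itertools.product((0, 1), repeat=4):
    p = 0.25                                 # P(x1) * P(x2)
    p *= (1 - eps) if y1 == x1 else eps      # P(y1 | x1)
    p *= (1 - eps) if y2 == x2 else eps      # P(y2 | x2)
    joint[(x1, x2, y1, y2)] = p

def marginal(dist, idx):
    """Marginalize a joint dict onto the coordinates in idx."""
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def cmi(dist, a, b, c):
    """I(A; B | C) from the joint, via four marginals."""
    pabc = marginal(dist, a + b + c)
    pac = marginal(dist, a + c)
    pbc = marginal(dist, b + c)
    pc = marginal(dist, c)
    total = 0.0
    for outcome, p in pabc.items():
        if p == 0:
            continue
        ka = outcome[:len(a)]
        kb = outcome[len(a):len(a) + len(b)]
        kc = outcome[len(a) + len(b):]
        total += p * log2(p * pc[kc] / (pac[ka + kc] * pbc[kb + kc]))
    return total

# Coordinates: 0 -> x1, 1 -> x2, 2 -> y1, 3 -> y2.
# Directed information: I(X1; Y1) + I(X1, X2; Y2 | Y1).
di = cmi(joint, (0,), (2,), ()) + cmi(joint, (0, 1), (3,), (2,))
mi = cmi(joint, (0, 1), (2, 3), ())  # I(X^2; Y^2)
print(f"directed info = {di:.6f}, mutual info = {mi:.6f}")
```

For this memoryless, feedback-free setting both quantities reduce to 2(1 − h(eps)), with h the binary entropy function, which makes the agreement easy to verify by hand.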