TY - JOUR

T1 - Finite state channels with time-invariant deterministic feedback

AU - Permuter, Haim Henry

AU - Weissman, Tsachy

AU - Goldsmith, Andrea J.

N1 - Funding Information:
Manuscript received August 17, 2006; revised March 21, 2008. Current version published February 04, 2009. This work was supported by the National Science Foundation (NSF) under Grant CCR-0311633, by an NSF CAREER grant, by the U.S. Army under MURI award W911NF-05-1-0246, and by the ONR under award N00014-05-1-0168. The material in this paper was presented in part at the IEEE International Symposium on Information Theory (ISIT), Seattle, WA, July 2006.

PY - 2009/3/4

Y1 - 2009/3/4

N2 - We consider the capacity of discrete-time channels with feedback for the general case where the feedback is a time-invariant deterministic function of the output samples. Under the assumption that the channel states take values in a finite alphabet, we find a sequence of achievable rates and a sequence of upper bounds on the capacity. The achievable rates and the upper bounds are computable for any N, and the limits of the sequences exist. We show that when the probability of the initial state is positive for all the channel states, the capacity is the limit of the achievable-rate sequence. We further show that when the channel is stationary, indecomposable, and has no intersymbol interference (ISI), its capacity is given by the limit of the maximum of the (normalized) directed information between the input X^N and the output Y^N, i.e., C = lim_{N→∞} (1/N) max I(X^N → Y^N), where the maximization is taken over the causal conditioning probability Q(x^N ∥ z^{N-1}) defined in this paper. The main idea for obtaining the results is to add causality into Gallager's results on finite state channels. The capacity results are used to show that the source-channel separation theorem holds for time-invariant deterministic feedback, and that if the state of the channel is known at both the encoder and the decoder, then feedback does not increase capacity.

AB - We consider the capacity of discrete-time channels with feedback for the general case where the feedback is a time-invariant deterministic function of the output samples. Under the assumption that the channel states take values in a finite alphabet, we find a sequence of achievable rates and a sequence of upper bounds on the capacity. The achievable rates and the upper bounds are computable for any N, and the limits of the sequences exist. We show that when the probability of the initial state is positive for all the channel states, the capacity is the limit of the achievable-rate sequence. We further show that when the channel is stationary, indecomposable, and has no intersymbol interference (ISI), its capacity is given by the limit of the maximum of the (normalized) directed information between the input X^N and the output Y^N, i.e., C = lim_{N→∞} (1/N) max I(X^N → Y^N), where the maximization is taken over the causal conditioning probability Q(x^N ∥ z^{N-1}) defined in this paper. The main idea for obtaining the results is to add causality into Gallager's results on finite state channels. The capacity results are used to show that the source-channel separation theorem holds for time-invariant deterministic feedback, and that if the state of the channel is known at both the encoder and the decoder, then feedback does not increase capacity.

KW - Causal conditioning

KW - Code-tree

KW - Directed information

KW - Feedback capacity

KW - Maximum likelihood

KW - Random coding

KW - Source-channel coding separation

UR - http://www.scopus.com/inward/record.url?scp=61349174057&partnerID=8YFLogxK

U2 - 10.1109/TIT.2008.2009849

DO - 10.1109/TIT.2008.2009849

M3 - Article

AN - SCOPUS:61349174057

VL - 55

SP - 644

EP - 662

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 2

ER -