## Relations between information theory, robustness, and statistical mechanics of stochastic systems

##### Date

2004

##### Source

Proceedings of the IEEE Conference on Decision and Control

##### Volume

4

##### Pages

3479-3484

##### Abstract

The fundamental questions addressed in this talk are the relations between dissipation, which is a concept of robustness; entropy rate, which is a concept of information theory; and statistical mechanics. Dissipation is a concept used in the theory and applications of robust filtering and control of uncertain systems. In thermodynamics, when a system is not in equilibrium with its surroundings, there exists a potential for producing useful work. Dissipation is the part of this potential that is not transformable to useful work. Entropy, on the other hand, is the fundamental concept on which information theory, and telecommunication systems in general, are founded. Entropy rate is a macroscopic property of thermodynamic systems that quantifies dissipation through the Clausius inequality and irreversible processes. In addition, entropy measures the number of microstates, that is, distinct configurations of the phase space, that correspond to a thermodynamic macrostate of a given entropy value. In this presentation, statistical mechanics concepts will be used to bring out the close relationship between entropy and dissipation and, in particular, the implication of this relationship for computing the induced norm associated with disturbance attenuation problems.
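The two thermodynamic facts the abstract invokes can be written out as standard formulas; these are textbook statements, not equations reproduced from the paper itself:

```latex
% Boltzmann's formula: entropy counts the number W of microstates
% (phase-space configurations) consistent with a given macrostate.
S = k_B \ln W

% Clausius inequality: over a cyclic process the integrated heat
% exchange is bounded by zero, with equality only for reversible
% processes; the shortfall quantifies dissipation.
\oint \frac{\delta Q}{T} \le 0
```

In the robustness setting sketched by the abstract, it is this dissipation shortfall that is connected to entropy rate and, through it, to the induced norm arising in disturbance attenuation problems.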