# Basic Ideas of Information Thermodynamics

Venue: CTS, Husova 4, Praha 1, 3rd floor

Lecturer:
Bohdan Hejna

We apply a unifying physical description to the results of Information Theory.
Assuming that heat entropy is a thermodynamic realization of information entropy, we construct a cyclical, thermodynamic, average-value model of an information transfer chain as a general heat engine, in particular a Carnot engine, reversible or irreversible. The working medium of the cycle (a thermodynamic system transforming input heat energy) can be regarded as a thermodynamic, average-value model, and thus as a realization, of an information transfer channel. We show that the extended II. Principle of Thermodynamics is valid for a model realized in this way, and we formulate its information form.
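A minimal numerical sketch of the heat-engine picture may help; this is our illustration, not the lecturer's model, and the reservoir temperatures and input heat are arbitrary assumed values. It shows the reversible (Carnot) bound on how much of the input heat can be converted to work:

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum efficiency of a reversible cycle between two reservoirs (temperatures in K)."""
    if not (t_hot > t_cold > 0):
        raise ValueError("require T_hot > T_cold > 0")
    return 1.0 - t_cold / t_hot

# Assumed example values: hot reservoir 600 K, cold reservoir 300 K,
# 1000 J of heat drawn from the hot reservoir per cycle.
q_in = 1000.0
eta = carnot_efficiency(600.0, 300.0)
work = eta * q_in        # work extracted by the reversible cycle
q_out = q_in - work      # heat rejected to the cold reservoir
print(eta, work, q_out)  # -> 0.5 500.0 500.0
```

An irreversible cycle extracts strictly less work than this bound, which is one way the II. Principle enters the model.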
We also address the problem of proving the II. Principle of Thermodynamics. We state the relation between the notion of information entropy, introduced by C. Shannon (1948), and thermodynamic entropy, introduced by R. Clausius (1850), and further explain the Gibbs paradox. Our approach connects the mathematical definitions of information entropies, and their mutual relations within a system of stochastic quantities, with thermodynamic entropies defined on an isolated system in which a realization of our (repeatable) observation is performed; such an observation is a (cyclic) transformation of the heat energy of the observed, measured system.
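The Shannon–Clausius correspondence can be sketched with the standard conversion factor k_B ln 2 per bit; the sketch below is our illustration of that widely used relation, not a formula taken from the lecture:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def shannon_entropy_bits(p):
    """Shannon information entropy H(X) = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def clausius_realization(h_bits):
    """Thermodynamic entropy k_B * ln(2) * H corresponding to H bits."""
    return K_B * math.log(2) * h_bits

# A fair binary source carries exactly one bit per symbol.
h = shannon_entropy_bits([0.5, 0.5])
s = clausius_realization(h)
print(h, s)  # -> 1.0 bit, about 9.57e-24 J/K per symbol
```

The point of the correspondence is that the dimensionless information entropy and the dimensional heat entropy differ only by this constant factor.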
We use this information description to analyze the Gibbs paradox, interpreting it as a property of such an observation, i.e., of measuring an (equilibrium) thermodynamic system. We give a logical proof of the II. Principle of Thermodynamics as a derivation of relations among the entropies of a physically realized system of stochastic variables, and we formulate the Equivalence Principle of the I., II. and III. Principles of Thermodynamics.
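For readers unfamiliar with the Gibbs paradox, the textbook computation behind it can be sketched as follows; this is our illustrative example, not the lecture's resolution, and the mole numbers are assumed:

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def mixing_entropy(n_a: float, n_b: float) -> float:
    """Entropy of mixing two DISTINGUISHABLE ideal gases at equal T and p.

    For identical gases the mixing entropy is zero, so this quantity
    jumps discontinuously as the gases become indistinguishable --
    the discontinuity that constitutes the Gibbs paradox.
    """
    n = n_a + n_b
    x_a, x_b = n_a / n, n_b / n
    return -n * R * (x_a * math.log(x_a) + x_b * math.log(x_b))

# One mole of each distinguishable gas: Delta S = 2 R ln 2.
print(mixing_entropy(1.0, 1.0))  # -> about 11.53 J/K
```

The information-theoretic reading treats the jump as a property of the observation: whether the observer can distinguish the two gases at all.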