The new Entropic Information approach
Information: notion and definition
Information, in simple terms, can be described as a collection of facts, knowledge, or data.
It is derived from the Latin word informare, which means "to give form to the mind".
Following Shannon's celebrated 1948 work, which initiated the development of information theory, the number of states in which something can be is precisely the definition of "information" (more precisely, of "lack of information").
"Statistical entropy is a probabilistic measure of uncertainty or ignorance; information is a measure of a reduction in that uncertainty" (Heylighen F., Joslyn C., Cybernetics and Second-Order Cybernetics).
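To make this quantitative link concrete, here is a minimal Python sketch (the function names are my own, for illustration, not part of any cited source): a system that can be in any of N equally likely states carries log2(N) bits of uncertainty, and learning its actual state removes exactly that much, so the information gained equals the initial entropy.

```python
import math

def uniform_entropy_bits(n_states: int) -> float:
    """Shannon entropy, in bits, of a uniform distribution over n_states."""
    return math.log2(n_states)

def shannon_entropy_bits(probs) -> float:
    """Shannon entropy H = -sum(p * log2(p)) of an arbitrary distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has 2 equally likely states: 1 bit of uncertainty,
# so observing the outcome yields exactly 1 bit of information.
print(uniform_entropy_bits(2))           # 1.0
# A biased coin is less uncertain, hence carries less than 1 bit.
print(shannon_entropy_bits([0.9, 0.1]))
```

The uniform case is the maximum: any bias in the distribution lowers the entropy, which matches the reading of entropy as the observer's ignorance before the measurement.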
Following the new Entropic Information Theory, the entropy of a thermodynamic system in equilibrium is a measure of the uncertainty as to which of its internal configurations, compatible with its macroscopic thermodynamic parameters (temperature, pressure, etc.), is actually realized.
This view is based on the perspective that the fundamental building block of our universe is entangled quantum information.
Here, the number of bits of the system is the number of bits necessary to specify the actual microscopic configuration among the total number of allowed microstates, and thus to characterize the macroscopic state of the system under consideration.
That is why we can say that entropy is information: indeed, in the microcanonical language, entropy is determined by the number of microstates compatible with a given macrostate.
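As an illustrative sketch of this microstate counting (the function names are hypothetical, chosen for this example), consider N independent two-level systems: they have Ω = 2^N microstates, so the Boltzmann entropy S = k_B ln Ω and the information I = log2 Ω bits describe the same count and differ only by the constant factor k_B ln 2.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def microstates_two_level(n_systems: int) -> int:
    """Number of microstates of n independent two-level systems."""
    return 2 ** n_systems

def boltzmann_entropy(omega: int) -> float:
    """Microcanonical (Boltzmann) entropy S = k_B * ln(Omega)."""
    return K_B * math.log(omega)

def information_bits(omega: int) -> float:
    """Bits needed to single out one microstate among Omega."""
    return math.log2(omega)

omega = microstates_two_level(100)
# Both quantities count the same microstates: S / (k_B ln 2) = I.
print(information_bits(omega))                          # 100.0
print(boltzmann_entropy(omega) / (K_B * math.log(2)))   # 100.0
```

Dividing the thermodynamic entropy by k_B ln 2 recovers the bit count exactly, which is the sense in which entropy "is" information in the microcanonical picture.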
" the informational properties of a system should be dependent only on the number of distinct messages that can be encoded on that system."
"Information is something physical that is encoded in the state of a quantum system."