Show that information is equivalent to negentropy and that the accompanying increase in entropy is never less than the amount of information obtained (the negentropic interpretation of information was proposed by L. Szilard in 1927). Using this equivalence, calculate the information about a system of two gases containing N1 and N2 molecules, respectively, by determining the change in entropy when the gases are mixed.

Note. The amount of information I about the state of a system is defined by the relation I = log2 P, where P is the number of distinct equally probable states of the system.
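A minimal sketch of the intended reasoning (a worked outline, not the detailed solution itself), assuming the Boltzmann relation S = k ln P and Stirling's approximation ln N! ≈ N ln N − N:

\[
S = k\ln P = (k\ln 2)\,\log_2 P = k I \ln 2 ,
\]

so entropy and information differ only by the constant factor k ln 2: acquiring information I about a system lowers its entropy by k I ln 2 (information plays the role of negentropy), and by the second law the entropy of the surroundings must then increase by at least this amount,

\[
\Delta S \ge k I \ln 2 .
\]

For two gases with N_1 and N_2 molecules, the number of equally probable ways to distribute the molecules after mixing is

\[
P = \frac{(N_1+N_2)!}{N_1!\,N_2!},
\qquad
\Delta S = k\ln P \approx k\bigl[(N_1+N_2)\ln(N_1+N_2) - N_1\ln N_1 - N_2\ln N_2\bigr].
\]

The information associated with the mixing is therefore

\[
I = \frac{\Delta S}{k\ln 2} = (N_1+N_2)\log_2(N_1+N_2) - N_1\log_2 N_1 - N_2\log_2 N_2 .
\]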