Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next we show...
Main Author: | Arieh Ben-Naim |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2017-01-01 |
Series: | Entropy |
Subjects: | entropy; Shannon’s measure of information; Second Law of Thermodynamics; H-theorem |
Online Access: | http://www.mdpi.com/1099-4300/19/2/48 |
id |
doaj-5c1eb22b3a4741ca8130d7d41df6a3bd |
record_format |
Article |
doi |
10.3390/e19020048 |
author_affiliation |
Department of Physical Chemistry, The Hebrew University of Jerusalem, Jerusalem 91904, Israel |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Arieh Ben-Naim |
title |
Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem |
publisher |
MDPI AG |
series |
Entropy |
issn |
1099-4300 |
publishDate |
2017-01-01 |
description |
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next we show that the Shannon Measure of Information (SMI) provides a solid and quantitative basis for the interpretation of the thermodynamic entropy. The entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, as well as two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally we show that the H-function as defined by Boltzmann is an SMI but not entropy; therefore, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics. |
topic |
entropy; Shannon’s measure of information; Second Law of Thermodynamics; H-theorem |
url |
http://www.mdpi.com/1099-4300/19/2/48 |
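The quantity at the heart of the abstract, Shannon’s Measure of Information, is the standard textbook functional SMI(p) = −Σ pᵢ log₂ pᵢ, defined on any discrete probability distribution. The sketch below is an illustration of that general definition, not code from the article; the function name `smi` and the example distributions are our own choices.

```python
import math

def smi(probabilities):
    """Shannon's measure of information (in bits) of a discrete
    distribution: SMI(p) = -sum(p_i * log2(p_i)). Terms with
    p_i == 0 contribute nothing, by the convention 0*log(0) = 0."""
    assert abs(sum(probabilities) - 1.0) < 1e-9, "must sum to 1"
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# SMI is maximal for the uniform distribution, which is why it can
# serve as a measure of uncertainty over locations and momenta.
print(smi([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 outcomes -> 2.0 bits
print(smi([0.7, 0.1, 0.1, 0.1]))      # a biased distribution has lower SMI
```

Note that SMI is defined for *any* such distribution, whereas the article’s point is that the thermodynamic entropy corresponds to the SMI only of a special (equilibrium) distribution, up to constants and corrections.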