How to Read Probability Distributions as Statements about Process

Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken over the measurement scale that relates changes in observed values to changes in information, and the transformation from the underlying scale on which information dissipates to alternative scales on which probability pattern may be expressed. Information invariances set the commonly observed measurement scales and the relations between them. In particular, a measurement scale for information is defined by its invariance to specific transformations of underlying values into measurable outputs. Essentially all common distributions can be understood within this simple framework of information invariance and measurement scale.
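
For readers parsing the abstract's four components, the framework rests on the standard maximum-entropy construction. The sketch below is a minimal illustration in notation assumed here (T for the measurement scale, lambda for the constraint multiplier, k for normalization) and is not taken from the record itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A minimal maximum-entropy sketch (assumed notation, not from the record):
% dissipating all information except the preserved average of a measurement
% scale $T(y)$ yields
\[
  p(y) = k\, e^{-\lambda T(y)}, \qquad
  \int p(y)\, dy = 1, \qquad
  \int T(y)\, p(y)\, dy = \text{const},
\]
% where $k$ normalizes and $\lambda$ enforces the average-value constraint.
% Common measurement scales recover familiar patterns:
% $T(y)=y$ gives the exponential, $T(y)=y^{2}$ a zero-mean Gaussian,
% and $T(y)=\log y$ a power law $p(y)\propto y^{-\lambda}$.
\end{document}

In this reading, the abstract's "transformation" component corresponds to a change of variables from the underlying scale y, on which information dissipates, to an alternative observed scale on which the probability pattern is expressed.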

Bibliographic Details
Main Author: Steven A. Frank (Department of Ecology & Evolutionary Biology, University of California, Irvine, CA 92697, USA)
Format: Article
Language: English
Published: MDPI AG 2014-11-01
Series: Entropy (ISSN 1099-4300), Vol. 16, Issue 11, pp. 6059-6098
Subjects: measurement; maximum entropy; information theory; statistical mechanics; extreme value distributions; neutral theories in biology
Online Access: http://www.mdpi.com/1099-4300/16/11/6059
DOI: 10.3390/e16116059