A Synergetic Theory of Information

A new approach to defining the amount of information is presented, in which information is understood as data about a finite set as a whole, and the average length of an integrative code of its elements serves as the measure of information. Within this approach, a formula is obtained, for the first time, for the syntropy of reflection, that is, the information that two intersecting finite sets reflect (reproduce) about each other. The reflection of discrete systems through the set of their parts is examined, and it is shown that reproducible information about a system (the additive syntropy of reflection) and non-reproducible information (the entropy of reflection) are measures of structural order and chaos, respectively. On this basis, a general classification of discrete systems is given by the ratio of order to chaos. Three information laws are established: the law of conservation of the sum of chaos and order; the information law of reflection; and the law of conservation and transformation of information. An assessment of the structural organization and the level of development of discrete systems is presented. It is shown that various measures of information are structural characteristics of the integrative codes of elements of discrete systems. It is concluded that, from the information-genetic standpoint, the synergetic approach to defining the quantity of information is primary relative to the approaches of Hartley and Shannon.

Bibliographic Details
Main Author: Viktor Vyatkin
Format: Article
Language: English
Published: MDPI AG 2019-04-01
Series: Information
Subjects: syntropy; entropy; chaos; order; amount of information; finite set; integrative code
Online Access: https://www.mdpi.com/2078-2489/10/4/142
id doaj-64b75c17c38e498fb25e8ef52ef14fb7
record_format Article
record_updated 2020-11-24T22:19:07Z
doi 10.3390/info10040142
author_affiliation Independent Researcher, 3-82 Papanina St., 620077 Ekaterinburg, Russia
collection DOAJ
language English
format Article
sources DOAJ
author Viktor Vyatkin
title A Synergetic Theory of Information
publisher MDPI AG
series Information
issn 2078-2489
publishDate 2019-04-01
description A new approach to defining the amount of information is presented, in which information is understood as data about a finite set as a whole, and the average length of an integrative code of its elements serves as the measure of information. Within this approach, a formula is obtained, for the first time, for the syntropy of reflection, that is, the information that two intersecting finite sets reflect (reproduce) about each other. The reflection of discrete systems through the set of their parts is examined, and it is shown that reproducible information about a system (the additive syntropy of reflection) and non-reproducible information (the entropy of reflection) are measures of structural order and chaos, respectively. On this basis, a general classification of discrete systems is given by the ratio of order to chaos. Three information laws are established: the law of conservation of the sum of chaos and order; the information law of reflection; and the law of conservation and transformation of information. An assessment of the structural organization and the level of development of discrete systems is presented. It is shown that various measures of information are structural characteristics of the integrative codes of elements of discrete systems. It is concluded that, from the information-genetic standpoint, the synergetic approach to defining the quantity of information is primary relative to the approaches of Hartley and Shannon.
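The abstract positions the synergetic approach against the classical Hartley and Shannon measures. For reference only, a minimal Python sketch of those two classical measures on a finite set is given below; the paper's own syntropy-of-reflection and entropy-of-reflection formulas are not reproduced in this record, so this sketch illustrates only the baseline approaches the abstract compares against, with an example partition chosen arbitrarily.

```python
import math
from collections import Counter

def hartley_information(elements):
    """Hartley measure: log2 of the number of elements in a finite set."""
    return math.log2(len(set(elements)))

def shannon_entropy(elements):
    """Shannon entropy of a finite set partitioned into classes,
    where each element's label names the class (part) it belongs to."""
    counts = Counter(elements)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# An 8-element set partitioned into parts of sizes 4, 2, 2 (labels mark the parts)
system = ["a", "a", "a", "a", "b", "b", "c", "c"]
print(hartley_information(range(8)))  # 3.0 bits: uniform code length for 8 elements
print(shannon_entropy(system))        # 1.5 bits: average code length under the partition
```

The Hartley measure depends only on the cardinality of the set, while the Shannon measure depends on how the set is partitioned into parts, which is the distinction the abstract's "structural characteristics of integrative codes" claim builds on.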
topic syntropy
entropy
chaos
order
amount of information
finite set
integrative code
url https://www.mdpi.com/2078-2489/10/4/142