Integrated information in discrete dynamical systems: motivation and theoretical framework.

This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. This mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences, so that when one particular experience occurs it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts.

The paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following:

(i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive.

(ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states.

(iii) phi varies as a function of network architecture: high values are obtained by architectures that conjoin functional specialization with functional integration, whereas strictly modular systems (which lack integration) and homogeneous systems (which lack information) cannot generate high phi. Feedforward and lattice architectures can generate high phi but are inefficient.

(iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks are optimized to achieve tension between local and global interactions.

These basic examples appear to match well against neurobiological evidence concerning the neural substrates of consciousness. More generally, phi appears to be a useful metric to characterize the capacity of any physical system to integrate information.
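
The measure has a compact formal core. As a brief sketch of the paper's definitions (notation follows the paper; the normalization the authors apply when selecting the minimum information partition is omitted here): given a system X with a known mechanism, the a posteriori repertoire p(X_0 -> x_1) is the distribution over time-0 states that could have caused the observed time-1 state x_1, obtained from the mechanism and the maximum-entropy a priori repertoire p^max(X_0). Effective information is the relative entropy between the two repertoires,

\[ \mathrm{ei}(X_0 \to x_1) = H\!\left[\, p(X_0 \to x_1) \,\middle\|\, p^{\max}(X_0) \right], \]

and, for a partition P = {M^1, ..., M^r} of the system into parts,

\[ \mathrm{ei}(X_0 \to x_1 / P) = H\!\left[\, p(X_0 \to x_1) \,\middle\|\, \prod_{k=1}^{r} p\!\left(M_0^k \to \mu_1^k\right) \right]. \]

Integrated information is then effective information across the minimum information partition (MIP), the decomposition into parts that leaves the least information unaccounted for:

\[ \varphi(x_1) = \mathrm{ei}\!\left(X_0 \to x_1 \,/\, \mathrm{MIP}\right). \]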

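To make this concrete, here is a minimal, illustrative Python sketch of these quantities for a small deterministic binary network. This is not the authors' code: the toy XOR network, the restriction to bipartitions, and the omission of the paper's MIP normalization are simplifying assumptions made here for brevity.

import itertools
import math

def posterior(update, n, x1, part):
    # A posteriori repertoire over the time-0 states of the elements in
    # `part`, given that those elements are observed in their x1 values at
    # time 1. Elements outside the part are fed in as maximum-entropy noise,
    # mirroring the paper's treatment of parts. Assumes x1 is reachable.
    rest = [i for i in range(n) if i not in part]
    weights = {}
    for mu0 in itertools.product((0, 1), repeat=len(part)):
        hits = 0
        for ext in itertools.product((0, 1), repeat=len(rest)):
            x0 = [0] * n
            for i, v in zip(part, mu0):
                x0[i] = v
            for i, v in zip(rest, ext):
                x0[i] = v
            y = update(tuple(x0))
            if all(y[i] == x1[i] for i in part):
                hits += 1
        weights[mu0] = hits
    total = sum(weights.values())
    return {mu0: w / total for mu0, w in weights.items()}

def relative_entropy(p, q):
    # H[p || q] in bits; infinite if q rules out a state that p allows.
    total = 0.0
    for state, pv in p.items():
        if pv == 0.0:
            continue
        if q[state] == 0.0:
            return float("inf")
        total += pv * math.log2(pv / q[state])
    return total

def effective_information(update, n, x1, partition):
    # ei(X0 -> x1 / P): relative entropy between the whole system's
    # repertoire and the product of its parts' repertoires.
    whole = posterior(update, n, x1, list(range(n)))
    parts = [posterior(update, n, x1, part) for part in partition]
    product = {}
    for x0 in whole:
        q = 1.0
        for part, rep in zip(partition, parts):
            q *= rep[tuple(x0[i] for i in part)]
        product[x0] = q
    return relative_entropy(whole, product)

def phi(update, n, x1):
    # phi(x1): effective information across the minimum information
    # bipartition. (The paper also considers finer partitions and applies
    # a normalization when selecting the MIP; both are omitted here.)
    best = float("inf")
    for size in range(1, n // 2 + 1):
        for a in itertools.combinations(range(n), size):
            b = [i for i in range(n) if i not in a]
            best = min(best, effective_information(update, n, x1, [list(a), b]))
    return best

# Toy example: three binary elements, each computing XOR of the other two.
def xor_net(x):
    return (x[1] ^ x[2], x[0] ^ x[2], x[0] ^ x[1])

print(phi(xor_net, 3, (1, 1, 0)))  # -> 1.0 bit for this state

Note that under a uniform (maximum-entropy) prior, the whole-system repertoire of a deterministic network is uniform over the preimages of x_1, so ei(X_0 -> x_1) = n - log2(#preimages) bits: states that sharply constrain their possible causes generate more information, which echoes the intuition behind finding (i) above.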

Bibliographic Details
Main Authors: David Balduzzi, Giulio Tononi
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2008-06-01
Series: PLoS Computational Biology, Vol. 4, No. 6, e1000091
ISSN: 1553-734X, 1553-7358
DOI: 10.1371/journal.pcbi.1000091
Source: Directory of Open Access Journals (DOAJ)
Online Access: http://europepmc.org/articles/PMC2386970?pdf=render