Using Information Theory to Study Efficiency and Capacity of Computers and Similar Devices

We address the problem of estimating the efficiency and capacity of computers. The main goal of our approach is to give a method for comparing the capacity of different computers, which can have different sets of instructions, different kinds of memory, a different number of cores (or processors), etc. We define the efficiency and capacity of computers and suggest a method for their estimation based on the analysis of processor instructions and their execution times. We show how the suggested method can be applied to estimate computer capacity; in particular, this consideration gives a new look at the organization of computer memory. The obtained results may be of interest for practical applications.
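The method described in the abstract — estimating capacity from the instruction set and instruction execution times — parallels Shannon's capacity of a discrete noiseless channel: if instruction i takes t_i time units, the capacity is C = log2(X0), where X0 is the largest real root of sum_i X^(-t_i) = 1. The following sketch illustrates that underlying information-theoretic formula, not the paper's exact algorithm; the function name and the example timings are hypothetical:

```python
import math

def computer_capacity(times, tol=1e-12):
    """Capacity in bits per time unit of a device whose instructions
    have the given execution times, via Shannon's noiseless-channel
    formula: C = log2(X0), X0 the largest real root of
    sum_i x**(-t_i) = 1."""
    f = lambda x: sum(x ** (-t) for t in times) - 1.0
    # f(x) is strictly decreasing for x > 1, so bracket the root
    # and solve by bisection.
    lo, hi = 1.0 + tol, 2.0
    while f(hi) > 0:          # expand until the root is bracketed
        hi *= 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.log2((lo + hi) / 2.0)
```

For example, two instructions that each take one time unit give 2/x = 1, so X0 = 2 and a capacity of exactly 1 bit per time unit; adding a third instruction that takes two units gives X0 = 1 + sqrt(2) and a capacity of about 1.27 bits per time unit.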


Bibliographic Details
Main Author: Boris Ryabko
Format: Article
Language: English
Published: MDPI AG, 2010-08-01
Series: Information
Subjects: computer capacity; computer efficiency; information theory; Shannon entropy; channel capacity
Online Access: http://www.mdpi.com/2078-2489/1/1/3/
ISSN: 2078-2489