Journal Article Details
Journal: Information
Using Information Theory to Study Efficiency and Capacity of Computers and Similar Devices
Keywords: computer capacity; computer efficiency; information theory; Shannon entropy; channel capacity
DOI: 10.3390/info1010003
Source: MDPI
【 Abstract 】

We address the problem of estimating the efficiency and capacity of computers. The main goal of our approach is to give a method for comparing the capacity of different computers, which may have different instruction sets, different kinds of memory, a different number of cores (or processors), etc. We define the efficiency and capacity of computers and suggest a method for estimating them, based on an analysis of processor instructions and their execution times. We show how the suggested method can be applied to estimate the capacity of a computer. In particular, this consideration gives a new view of how computer memory is organized. The obtained results may be of interest for practical applications.
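The abstract describes the capacity estimate as being built from the instruction set and the execution time of each instruction. As a minimal sketch of that idea (not necessarily the paper's exact formulation), the Python snippet below computes a capacity in the spirit of Shannon's capacity of a noiseless channel with symbols of unequal durations: C is the solution of Σ_i 2^(−C·τ_i) = 1, where τ_i is the execution time of instruction i. The function name and the example execution times are hypothetical.

```python
import math

def computer_capacity(exec_times, tol=1e-12):
    """Capacity (bits per time unit) for an instruction set whose
    instructions take exec_times[i] time units each.

    Solves sum_i 2**(-C * t_i) = 1 for C, the analogue of Shannon's
    noiseless-channel capacity with symbols of unequal durations.
    (Sketch under the assumption stated in the text above.)
    """
    f = lambda c: sum(2.0 ** (-c * t) for t in exec_times) - 1.0
    # f(C) is strictly decreasing; bracket the root, then bisect.
    lo, hi = 0.0, 1.0
    while f(hi) > 0:          # grow the upper bound until the root is bracketed
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical example: four instruction types with execution times in clock
# cycles (illustrative values, not taken from the paper).
times = [1, 1, 2, 3]
print(f"capacity ≈ {computer_capacity(times):.4f} bits per clock cycle")
```

With two instructions of duration 1 the formula gives C = 1 bit per cycle, matching the intuition that each cycle can distinguish two equally fast instructions; longer instructions pull the capacity down.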

【 License 】

CC BY   
© 2010 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachment List
File: RO202003190052501ZK.pdf (232 KB, PDF)