Journal Article Details
Communications
Various Approaches Proposed for Eliminating Duplicate Data in a System
Adam Dudáš [1], Roman Čerešňák [2], Karol Matiaško [2]
[1] Department of Computer Science, Faculty of Natural Sciences, Matej Bel University, Banska Bystrica, Slovakia; [2] Department of Informatics, Faculty of Management Science and Informatics, University of Zilina, Zilina, Slovakia
Keywords: data duplication; distributed data processing; software architecture
DOI: 10.26552/com.C.2021.4.A223-A232
Source: DOAJ
[ Abstract ]

The growth of the big data processing market has led to increased overload of computation data centers, to changes in the methods used to store data, and to changes in the communication between computing units and in the computational time needed to process or edit the data. Methods of distributed and parallel data processing have brought new problems related to computations on data, which need to be examined. Unlike conventional cloud services, a tight connection between the data and the computations is one of the main characteristics of big data services: computational tasks can be completed only if the relevant data are available. Three factors influence the speed and efficiency of data processing: data duplication, data integrity and data security. We are motivated to study the problems related to the growing time needed for data processing by optimizing these three factors in geographically distributed data centers.
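The abstract names data duplication as one of the factors slowing down processing in distributed data centers. As a minimal illustrative sketch (not the authors' method), duplicate records can be detected by fingerprinting each record's content with a cryptographic hash and keeping only the first occurrence of each fingerprint; the function names and the sample records below are hypothetical:

```python
import hashlib


def content_fingerprint(data: bytes) -> str:
    """Return a SHA-256 fingerprint of a record's content,
    used to detect byte-identical duplicates."""
    return hashlib.sha256(data).hexdigest()


def deduplicate(records: list[bytes]) -> list[bytes]:
    """Keep only the first occurrence of each distinct record content."""
    seen: set[str] = set()
    unique: list[bytes] = []
    for rec in records:
        fp = content_fingerprint(rec)
        if fp not in seen:
            seen.add(fp)
            unique.append(rec)
    return unique


# Hypothetical sample data: the third record duplicates the first.
records = [b"sensor reading 1", b"sensor reading 2", b"sensor reading 1"]
print(len(deduplicate(records)))  # 2
```

In a geographically distributed setting, the same idea is typically applied per node, with only the compact fingerprints exchanged between data centers to decide which records need to be transferred or stored again.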

[ License ]

Unknown
