Journal Article Details
Journal of Big Data
Sandbox security model for Hadoop file system
S. Zahoor Ul Huq [1]  A. P. Siva Kumar [2]
[1] Department of Computer Science and Engineering, GPREC, Kurnool, Andhra Pradesh, India; [2] Department of Computer Science and Engineering, JNTUA, Anantapuramu, Andhra Pradesh, India
Keywords: HDFS; MapReduce; Fsimage; Hadoop; Kerberos
DOI: 10.1186/s40537-020-00356-z
Source: Springer
【 Abstract 】

Extensive use of Internet-based applications in day-to-day life has led to the generation of huge amounts of data every minute. Apart from humans, data is generated by machines such as sensors, satellites, and CCTV. This huge collection of heterogeneous data is often referred to as Big Data, and it can be processed to draw useful insights. Apache Hadoop has emerged as a widely used open source software framework for Big Data processing; it runs on a cluster of cooperating computers that enables distributed parallel processing. The Hadoop Distributed File System (HDFS) stores data as blocks that are replicated and spanned across different nodes. HDFS uses AES-based cryptographic techniques at the block level that are transparent and end-to-end in nature. Although cryptography provides security against unauthorized access to the data blocks, a legitimate user can still harm the data; one example is the execution of malicious MapReduce jar files by a legitimate user, which can damage data in the HDFS. We developed a mechanism in which every MapReduce jar is tested by our sandbox security layer to ensure that the jar is not malicious, and suspicious jar files are not allowed to process the data in the HDFS. This feature is not present in the existing Apache Hadoop framework, and our work is made available on GitHub for consideration and inclusion in future versions of Apache Hadoop.
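The sandbox mechanism itself is described only in the full text, so the following minimal Java sketch is merely one way to illustrate the general idea of screening a MapReduce jar before it is allowed to touch HDFS: it scans the jar's class files for constant-pool references to classes commonly abused by malicious jobs (e.g. java.lang.Runtime for spawning shell commands). The class name JarSandboxCheck, the blacklist, and the naive string-matching approach are illustrative assumptions, not the authors' implementation.

// Illustrative sketch only; NOT the paper's actual sandbox. Screens a
// MapReduce jar for suspicious class references before job submission.
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class JarSandboxCheck {

    // Hypothetical blacklist: class names whose presence suggests the job
    // tries to spawn external processes rather than just map/reduce data.
    private static final String[] SUSPICIOUS = {
        "java/lang/Runtime",
        "java/lang/ProcessBuilder",
    };

    public static boolean isSuspicious(String jarPath) throws IOException {
        try (JarFile jar = new JarFile(jarPath)) {
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                JarEntry entry = entries.nextElement();
                if (!entry.getName().endsWith(".class")) continue;
                byte[] bytes = readFully(jar.getInputStream(entry));
                // ISO-8859-1 maps each byte to one char, so ASCII markers in
                // the class file's constant pool survive as plain substrings.
                String text = new String(bytes, StandardCharsets.ISO_8859_1);
                for (String marker : SUSPICIOUS) {
                    if (text.contains(marker)) {
                        System.out.println(entry.getName() + " references " + marker);
                        return true;
                    }
                }
            }
        }
        return false;
    }

    private static byte[] readFully(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) out.write(buf, 0, n);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Submit the job only if the jar passes the screen.
        if (isSuspicious(args[0])) {
            System.err.println("Rejected: jar flagged as suspicious");
        } else {
            System.out.println("Jar passed naive sandbox screen");
        }
    }
}

A production check would of course inspect bytecode properly (e.g. with a bytecode-analysis library) rather than matching strings, but the sketch captures the gatekeeping step the abstract describes: test the jar first, and refuse to run suspicious ones against HDFS.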

【 License 】

CC BY   

【 Preview 】
Attachment list
Files Size Format
RO202104264650267ZK.pdf 1199 KB PDF