Journal Article Details
Applied Sciences
A Reinforcement Learning Based Data Caching in Wireless Networks
Muhammad Sheraz 1, Shahryar Shafique 1, Sohail Imran 1, Jahanzeb Khan 1, Muhammad Asif 2, Muhammad Ibrar 3, Lunchakorn Wuttisittikulkij 4, Rizwan Ullah 4
[1] Department of Electrical Engineering, Iqra National University, Peshawar 25000, Pakistan
[2] Department of Electrical Engineering, Main Campus, University of Science & Technology, Bannu 28100, Pakistan
[3] Department of Physics, Islamia College Peshawar, Peshawar 25000, Pakistan
[4] Wireless Communication Ecosystem Research Unit, Department of Electrical Engineering, Chulalongkorn University, Bangkok 10330, Thailand
Keywords: caching; network delay; small base station; 5G; dynamic data popularity; reinforcement learning
DOI: 10.3390/app12115692
Source: DOAJ
【 Abstract 】

Data caching has emerged as a promising technique to handle the growing data traffic and backhaul congestion of wireless networks. However, a key challenge is how and where to place contents so as to optimize data access by users. Data caching can be exploited close to users by deploying cache entities at Small Base Stations (SBSs). In this approach, SBSs cache contents through the core network during off-peak traffic hours; the SBSs then serve cached contents to content-demanding users during peak traffic hours with low latency. In this paper, we exploit the potential of data caching at the SBS level to minimize data access delay. We propose an intelligence-based data caching mechanism inspired by an artificial intelligence approach known as Reinforcement Learning (RL). Our proposed RL-based data caching mechanism adapts through dynamic learning and tracks network states to capture users' diverse and varying data demands. The approach optimizes data caching at the SBS level by observing users' data demands and locations, so that the limited cache resources of the SBS are used efficiently. Extensive simulations are performed to evaluate the performance of the proposed caching mechanism with respect to various factors such as caching capacity and data library size. The obtained results demonstrate that our proposed caching mechanism achieves a 4% performance gain in terms of delay vs. contents, a 3.5% gain in terms of delay vs. users, a 2.6% gain in terms of delay vs. cache capacity, an 18% gain in terms of percentage traffic offloading vs. popularity skewness (γ), and a 6% gain in terms of backhaul saving vs. cache capacity.
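To make the idea concrete, the sketch below shows a minimal RL-flavoured SBS cache in Python. It is a hypothetical illustration, not the paper's algorithm: per-content value estimates are updated with a TD-style exponential average whenever a content is requested, cached-but-unrequested contents decay over time (capturing dynamic popularity), and on a cache miss the lowest-valued cached content is evicted only if the newcomer has learned a higher value. The class name, parameters `alpha` (learning rate), and `decay` are all assumptions for illustration.

```python
from collections import defaultdict


class RLCache:
    """Hypothetical sketch of an RL-style content cache at a Small Base
    Station (SBS): learned per-content values drive placement/eviction,
    adapting to shifting content popularity. Not the paper's exact method."""

    def __init__(self, capacity, alpha=0.3, decay=0.05):
        self.capacity = capacity      # limited SBS cache size
        self.alpha = alpha            # learning rate for value updates
        self.decay = decay            # forgetting rate for stale contents
        self.q = defaultdict(float)   # learned value estimate per content id
        self.cache = set()            # content ids currently cached

    def request(self, content):
        """Serve one user request; return True on a cache hit."""
        hit = content in self.cache
        # Decay cached values so contents that stop being requested
        # gradually lose value (tracks dynamic data popularity).
        for c in self.cache:
            self.q[c] *= (1.0 - self.decay)
        # TD-style update: move the requested content's value toward 1.
        self.q[content] += self.alpha * (1.0 - self.q[content])
        if not hit:
            if len(self.cache) < self.capacity:
                self.cache.add(content)
            else:
                # Evict the lowest-valued content, but only if the
                # newcomer's learned value actually exceeds it.
                victim = min(self.cache, key=lambda c: self.q[c])
                if self.q[victim] < self.q[content]:
                    self.cache.remove(victim)
                    self.cache.add(content)
        return hit


if __name__ == "__main__":
    # Content 0 is popular; contents 1-4 are one-off requests.
    cache = RLCache(capacity=2)
    requests = [0, 1, 0, 2, 0, 3, 0, 4, 0, 0]
    hits = [cache.request(r) for r in requests]
    print(0 in cache.cache, hits.count(True))  # popular content retained
```

With this request pattern, content 0 is never evicted after its first insertion, so every later request for it is served from the SBS cache instead of the backhaul, which is the delay-saving effect the abstract describes.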

【 License 】

Unknown
