In this article, we propose a novel cooperative hierarchical caching framework for Cloud Radio Access Networks (C-RANs), in which a new cloud-cache at the Cloud Processing Unit (CPU) is envisioned to bridge the storage-capacity/delay-performance gap between the traditional edge-based and core-based caching paradigms. A delay-cost model is introduced, and the cache placement problem is formulated with the aim of minimizing the average delay-cost of content delivery in the network. Given the NP-completeness of the cache placement problem, we propose a low-complexity heuristic cache-management strategy comprising a proactive cache-distribution algorithm and a reactive cache-replacement algorithm. Furthermore, a Cache-Aware Request Scheduling (CARS) algorithm is devised to optimize, in an online fashion, the tradeoff between content download rate and content access delay. Extensive numerical simulations, carried out using both real-world YouTube video requests and synthetic content requests, demonstrate that the proposed cache-management strategy outperforms traditional caching strategies in terms of cache hit ratio, average content access delay, and backhaul traffic load. Additionally, the proposed CARS algorithm is shown to achieve a superior tradeoff over traditional approaches that optimize either users' download rate or access delay alone.
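The hierarchical delivery idea in the abstract can be illustrated with a minimal sketch: a request is served by the nearest tier holding the content (edge cache, then the CPU cloud-cache, then the core network), and each tier contributes a delay-cost. The tier delay values and the function below are illustrative assumptions; the paper's actual delay-cost model and placement formulation are not given in the abstract.

```python
# Illustrative sketch only: tier delay-costs below are assumed values,
# not the paper's model. Lower tiers (edge) are cheaper than the CPU
# cloud-cache, which is cheaper than fetching from the core network.
EDGE_DELAY, CLOUD_DELAY, CORE_DELAY = 1.0, 5.0, 20.0

def average_delay_cost(requests, edge_cache, cloud_cache):
    """Average delay-cost when each request is served by the nearest
    tier that holds the content: edge, then cloud-cache, else core."""
    total = 0.0
    for content in requests:
        if content in edge_cache:
            total += EDGE_DELAY
        elif content in cloud_cache:
            total += CLOUD_DELAY
        else:
            total += CORE_DELAY
    return total / len(requests)

# Example: placing popular items at the edge lowers the average delay-cost.
requests = ["a", "a", "b", "c"]
print(average_delay_cost(requests, edge_cache={"a"}, cloud_cache={"b"}))  # (1+1+5+20)/4 = 6.75
```

A cache-placement algorithm of the kind the abstract describes would choose the tier contents (here, `edge_cache` and `cloud_cache`) under per-tier capacity limits so as to minimize this average over the expected request pattern.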
All Science Journal Classification (ASJC) codes
- Computer Networks and Communications
- Electrical and Electronic Engineering
Keywords
- Cooperative caching
- Cloud radio access networks
- Content request scheduling
- Hierarchical caching