Paper Title

Reinforcement Learning on Computational Resource Allocation of Cloud-based Wireless Networks

Authors

Beiran Chen, Yi Zhang, George Iosifidis, Mingming Liu

Abstract


Wireless networks used for the Internet of Things (IoT) are expected to rely heavily on cloud-based computing and processing. Softwarised and centralised signal processing and network switching in the cloud enable flexible network control and management. In a cloud environment, dynamic computational resource allocation is essential for saving energy while maintaining the performance of the processes. The stochastic nature of Central Processing Unit (CPU) load variation, together with the possibly complex parallelisation of cloud processes, makes dynamic resource allocation an interesting research challenge. This paper models the dynamic computational resource allocation problem as a Markov Decision Process (MDP) and designs a model-based reinforcement-learning agent to optimise the dynamic allocation of CPU resources. A value iteration method is used by the reinforcement-learning agent to derive the optimal policy for the MDP. To evaluate performance, we analyse two types of processes that can be used in cloud-based IoT networks with different levels of parallelisation capability, namely Software-Defined Radio (SDR) and Software-Defined Networking (SDN). The results show that our agent converges rapidly to the optimal policy, performs stably across different parameter settings, and outperforms, or at least matches, a baseline algorithm in energy savings across different scenarios.
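The abstract names value iteration as the method the agent uses to solve the MDP. As a minimal illustration of that technique (not the paper's actual model), the sketch below runs value iteration on a hypothetical three-state MDP in which states represent CPU allocation levels and actions scale the allocation down, hold it, or scale it up; every transition probability and reward here is invented for the example.

```python
import numpy as np

# Toy MDP (all numbers hypothetical): 3 states = low/medium/high CPU load,
# 3 actions = scale down / hold / scale up.
n_actions, gamma, theta = 3, 0.9, 1e-6

# P[a][s, s']: probability of moving from state s to s' under action a.
P = np.array([
    [[0.9, 0.1, 0.0], [0.8, 0.2, 0.0], [0.1, 0.8, 0.1]],  # scale down
    [[0.7, 0.3, 0.0], [0.1, 0.8, 0.1], [0.0, 0.3, 0.7]],  # hold
    [[0.1, 0.8, 0.1], [0.0, 0.2, 0.8], [0.0, 0.1, 0.9]],  # scale up
])
# R[s, a]: reward for taking action a in state s (energy saving vs.
# performance penalty, values invented for illustration).
R = np.array([[1.0, 0.0, -1.0],   # low load: scaling down saves energy
              [0.0, 1.0, 0.0],    # medium load: holding is best
              [-1.0, 0.0, 1.0]])  # high load: scale up to keep performance

V = np.zeros(3)
while True:
    # Bellman optimality backup: Q[s, a] = R[s, a] + gamma * sum_s' P * V[s']
    Q = R + gamma * np.array([P[a] @ V for a in range(n_actions)]).T
    V_new = Q.max(axis=1)
    if np.abs(V_new - V).max() < theta:
        break
    V = V_new

policy = Q.argmax(axis=1)  # greedy action per state
print(policy)  # → [0 1 2]
```

For these invented numbers the converged policy is the intuitive one: scale down under low load, hold under medium load, and scale up under high load. The paper's agent additionally has to learn the transition model from observed CPU load traces, which is what makes it model-based reinforcement learning rather than pure planning.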
