A Machine Learning Approach to Achieving Energy Efficiency in Relay-Assisted LTE-A Downlink System

Sensors (Basel). 2019 Aug 8;19(16):3461. doi: 10.3390/s19163461.

Abstract

In recent years, Energy Efficiency (EE) has become a critical design metric for cellular systems. Achieving EE also requires a fine balance between throughput and fairness. To this end, in this paper we present several resource block (RB) allocation schemes for relay-assisted Long Term Evolution-Advanced (LTE-A) networks. For a single relay, a Maximum Throughput (MT) scheme and a SAMM (an acronym of the authors' names) scheme, which alternates between MT and Proportional Fairness (PF) RB allocation, are presented, each driven by either equal power allocation or the Bisection-based Optimal Power Allocation (BOPA) algorithm. In the case of multiple relays, the dependency of RB and power allocation on relay deployment and user association is first addressed through a k-means clustering approach. Secondly, to reduce the computational cost of RB and power allocation, a two-step neural network (NN) process (SAMM NN) is presented that uses SAMM-based unsupervised learning for RB allocation and BOPA-based supervised learning for power allocation. The results for all the schemes are compared in terms of EE and user throughput. For a single relay, SAMM BOPA offers the best EE, whereas SAMM equal power provides the best fairness. In the case of multiple relays, the results indicate that SAMM NN achieves better EE than SAMM equal power and SAMM BOPA, and better throughput fairness than MT equal power and MT BOPA.
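To illustrate the bisection idea underlying BOPA (the paper's exact formulation, which optimizes EE across RBs and relays, is not reproduced here), the sketch below uses a bisection search to find the minimum transmit power on a single RB that meets a target rate under a simple Shannon-capacity model. The bandwidth, channel gain, noise power, and target rate are hypothetical values chosen only for the example.

```python
import math

def min_power_for_rate(target_rate_bps, bandwidth_hz, channel_gain, noise_power_w,
                       p_max_w=1.0, tol=1e-9):
    """Bisection search for the smallest transmit power (W) on one RB that meets
    a target Shannon rate: R = B * log2(1 + p * g / N0).

    Illustrative sketch only; it is not the BOPA algorithm from the paper.
    """
    def rate(p):
        return bandwidth_hz * math.log2(1.0 + p * channel_gain / noise_power_w)

    if rate(p_max_w) < target_rate_bps:
        return None  # target rate infeasible even at maximum power

    lo, hi = 0.0, p_max_w
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if rate(mid) >= target_rate_bps:
            hi = mid  # mid already meets the rate; try lower power
        else:
            lo = mid  # mid falls short; more power is needed
    return hi

# Hypothetical example: one 180 kHz LTE RB with a 500 kbps rate target.
p = min_power_for_rate(target_rate_bps=5e5, bandwidth_hz=180e3,
                       channel_gain=1e-7, noise_power_w=1e-10)
print(f"Minimum power meeting the rate target: {p:.6f} W" if p else "Infeasible")
```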

Keywords: LTE-A; bisection-based optimal power allocation; energy efficiency; machine learning; proportional rate constraint; resource block allocation; water-filling algorithm.