Energy-Efficient Ultra-Dense Network using Deep Reinforcement Learning
Cited 2 times in Web of Science
Cited 11 times in Scopus
- Authors
- Issue Date
- 2020-05
- Publisher
- IEEE
- Citation
- Proceedings of the 21st IEEE International Workshop on Signal Processing Advances in Wireless Communications (IEEE SPAWC 2020), p. 9154261
- Abstract
- With the explosive growth in mobile data traffic, pursuing energy efficiency has become one of the key challenges for next-generation communication systems. In recent years, an approach that reduces the energy consumption of base stations (BSs) by selectively turning them off, referred to as the sleep mode technique, has been suggested. However, due to macro-cell-oriented network operation and computational overhead, this approach has not been very successful in the past. In this paper, we propose an approach to determine the BS active/sleep modes of an ultra-dense network (UDN) using deep reinforcement learning (DRL). A key ingredient of the proposed scheme is the use of an action elimination network to reduce the large action space (active/sleep mode selection). Numerical results show that the proposed scheme can significantly reduce the energy consumption of the UDN while ensuring the QoS requirements of the network.
- ISSN
- 2325-3789
- Files in This Item:
- There are no files associated with this item.
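The core idea in the abstract, selecting a BS active/sleep pattern by first eliminating actions that would violate QoS and then picking the highest-valued surviving action, can be illustrated with a minimal sketch. This is not the paper's implementation: the BS count, the unit-power and per-BS-capacity numbers, and the toy Q-values are all hypothetical, and the learned action elimination network is replaced by a simple oracle QoS check for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BS = 4  # hypothetical number of base stations in the toy UDN
# Each row is one joint action: active (1) / sleep (0) per BS -> 2^N patterns
ACTIONS = np.array(
    [[(a >> i) & 1 for i in range(N_BS)] for a in range(2 ** N_BS)]
)

def qos_ok(action, demand, capacity_per_bs=2.0):
    """Toy QoS check: total active capacity must cover traffic demand
    (capacity_per_bs is an assumed constant, not from the paper)."""
    return action.sum() * capacity_per_bs >= demand

def eliminate(actions, demand):
    """Action-elimination step: mask patterns predicted to violate QoS.
    In the paper this mask comes from a learned elimination network;
    here an oracle heuristic stands in for it."""
    return np.array([qos_ok(a, demand) for a in actions])

def select_action(q_values, actions, demand):
    """Pick the highest-Q action among those that survive elimination."""
    mask = eliminate(actions, demand)
    masked_q = np.where(mask, q_values, -np.inf)
    return int(np.argmax(masked_q))

# Toy Q-values that reward low energy: Q = -(number of active BSs) + noise
q_values = -ACTIONS.sum(axis=1) + 0.01 * rng.standard_normal(len(ACTIONS))

idx = select_action(q_values, ACTIONS, demand=3.0)
chosen = ACTIONS[idx]
print("chosen pattern:", chosen, "| active BSs:", chosen.sum())
```

With a demand of 3.0 and 2.0 capacity per BS, every pattern with fewer than two active BSs is eliminated, so the selector settles on a two-BS pattern: the minimum energy that still meets QoS, which is exactly the trade-off the abstract describes.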
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.