EORL: Energy Optimization via Reinforcement Learning in Software-Defined Wireless Sensor Networks

dc.contributor.author: Arnut Boonlert
dc.contributor.author: Chotipat Pornavalai
dc.contributor.author: Panwit Tuwanut
dc.contributor.author: Sarayoot Tanessakulwattana
dc.date.accessioned: 2026-05-08T19:24:25Z
dc.date.issued: 2024-11-14
dc.description.abstract: A wireless sensor network is a collection of sensors deployed in a particular area to collect data and transmit it to a base station, or sink. These sensors usually rely on batteries as their primary power source, and prolonged operation exhausts their energy. Replacing batteries may not be cost-effective compared with developing an algorithm that optimizes energy efficiency to extend the network lifetime. This work optimizes the energy consumption of wireless sensor networks by adaptively selecting an optimal routing path in a Software-Defined Wireless Sensor Network (SDWSN) environment. We introduce the concept of balancing energy among nodes according to the current network status, enforced by the SDWSN controller using Reinforcement Learning (RL). We propose Energy Optimization via Reinforcement Learning (EORL) for SDWSN, built on a minimum-energy reward function and a state design that accounts for energy consumption. The EORL algorithm identifies the node that requires attention and selects the most energy-efficient path for that node. Performance results show that EORL achieves a longer network lifetime than other RL solutions.
dc.identifier.doi: 10.1109/incit63192.2024.10810492
dc.identifier.uri: https://dspace.kmitl.ac.th/handle/123456789/19581
dc.subject: Energy Harvesting in Wireless Networks
dc.subject: Energy Efficient Wireless Sensor Networks
dc.subject: Software-Defined Networks and 5G
dc.title: EORL: Energy Optimization via Reinforcement Learning in Software-Defined Wireless Sensor Networks
dc.type: Article
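The abstract's idea of an RL agent that penalizes energy use and steers traffic away from nearly depleted relays can be illustrated with a minimal tabular Q-learning sketch. The topology, energy values, and the exact reward shape below are illustrative assumptions, not the paper's actual EORL formulation: the reward charges each hop its link transmission cost plus a penalty inversely proportional to the relay's residual battery, approximating the "energy balance" and "minimum energy" ideas described above.

```python
import random

# Hypothetical topology: node -> {neighbor: link transmission energy cost}.
# Node 3 is the sink. All values are illustrative, not from the paper.
LINKS = {
    0: {1: 2.0, 2: 3.0},
    1: {3: 2.0, 2: 1.0},
    2: {3: 1.5},
}
SINK = 3
# Remaining battery per non-sink node (illustrative units).
# Node 1 is nearly depleted, so the balance penalty should route around it.
residual = {0: 10.0, 1: 1.0, 2: 10.0}

ALPHA, GAMMA, EPS, EPISODES = 0.5, 0.9, 0.2, 500

def reward(node, nxt):
    # Assumed reward: negative link energy cost, plus a penalty for
    # relaying through a node with low residual energy (energy balance).
    r = -LINKS[node][nxt]
    if nxt != SINK:
        r -= 1.0 / residual[nxt]
    return r

# Q-table over (node, next-hop) pairs.
Q = {n: {m: 0.0 for m in nbrs} for n, nbrs in LINKS.items()}

random.seed(0)
for _ in range(EPISODES):
    node = 0  # the node the controller flagged as needing attention
    while node != SINK:
        nbrs = list(Q[node])
        # Epsilon-greedy next-hop selection.
        nxt = random.choice(nbrs) if random.random() < EPS \
            else max(nbrs, key=Q[node].get)
        target = reward(node, nxt)
        if nxt != SINK:
            target += GAMMA * max(Q[nxt].values())
        Q[node][nxt] += ALPHA * (target - Q[node][nxt])
        node = nxt

def best_path(src):
    """Greedy route from src to the sink under the learned Q-table."""
    path, node = [src], src
    while node != SINK:
        node = max(Q[node], key=Q[node].get)
        path.append(node)
    return path

print(best_path(0))  # route that avoids the nearly depleted node 1
```

Under these assumed values the agent learns to route 0 → 2 → 3 even though the direct link through node 1 has a lower transmission cost, because the residual-energy penalty makes draining node 1 unattractive; this is the kind of trade-off an energy-balancing reward is meant to capture.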